Willingness to Pay Survey for Section 316(b) Existing Facilities Cooling Water Intake Structures

OMB: 2040-0283






Supporting Statement for Information Collection Request for

Willingness to Pay Survey for §316(b) Existing Facilities Cooling Water Intake Structures: Instrument, Pre-test, and Implementation




TABLE OF CONTENTS


PART A OF THE SUPPORTING STATEMENT 1


1. Identification of the Information Collection 1

1(a) Title of the Information Collection 1

1(b) Short Characterization (Abstract) 1


2. Need For and Use of the Collection 4

2(a) Need/Authority for the Collection 4

2(b) Practical Utility/Users of the Data 4


3. Non-duplication, Consultations, and Other Collection Criteria 5

3(a) Non-duplication 5

3(b) Public Notice Required Prior to ICR Submission to OMB 6

3(c) Consultations 11

3(d) Effects of Less Frequent Collection 12

3(e) General Guidelines 12

3(f) Confidentiality 12

3(g) Sensitive Questions 12


4. The Respondents and the Information Requested 12

4(a) Respondents 12

4(b) Information Requested 14

(I) Data items, including recordkeeping requirements 14

(II) Respondent activities 17


5. The Information Collected - Agency Activities, Collection Methodology, and Information Management 18

5(a) Agency Activities 18

5(b) Collection Methodology and Information Management 19

5(c) Small Entity Flexibility 20

5(d) Collection Schedule 20


6. Estimating Respondent Burden and Cost of Collection 21

6(a) Estimating Respondent Burden 21

6(b) Estimating Respondent Costs 21

6(c) Estimating Agency Burden and Costs 21

6(d) Respondent Universe and Total Burden Costs 23

6(e) Bottom Line Burden Hours and Costs 23

6(f) Reasons for Change in Burden 24

6(g) Burden Statement 24


PART B OF THE SUPPORTING STATEMENT 25


1. Survey Objectives, Key Variables, and Other Preliminaries 25

1(a) Survey Objectives 25

1(b) Key Variables 25

1(c) Statistical Approach 26

1(d) Feasibility 26


2. Survey Design 27

2(a) Target Population and Coverage 27

2(b) Sampling Design 27

(I) Sampling Frames 27

(II) Sample Sizes 27

(III) Stratification Variables 27

(IV) Sampling Method 28

(V) Multi-Stage Sampling 28

2(c) Precision Requirements 29

(I) Precision Targets 29

(II) Non-Sampling Errors 29

2(d) Questionnaire Design 32


3. Pretests and Pilot Tests 33


4. Collection Methods and Follow-up 35

4(a) Collection Methods 35

4(b) Survey Response and Follow-up 35


5. Analyzing and Reporting Survey Results 36

5(a) Data Preparation 36

5(b) Analysis 36

5(c) Reporting Results 51


REFERENCES 52


ATTACHMENTS


Attachment 1: Full Text of Regional Stated Preference Survey Component (Northeast Regional Example) 59

Attachment 2: Full Text of National Stated Preference Survey Component 79

Attachment 3: First Federal Register Notice 99

Attachment 4: Second Federal Register Notice 109

Attachment 5: Description of Statistical Survey Design 116

Attachment 6: Preview Letter to Mail Survey Recipients 131

Attachment 7: Cover Letter to Mail Survey Recipients 132

Attachment 8: Post Card Reminder to Mail Survey Recipients 133

Attachment 9: Cover Letter to Recipients of the Second Survey Mailing 134


Attachment 10: Reminder Letter to Mail Survey Recipients 135

Attachment 11: Letter to Participants in the Telephone Non-response Survey 136

Attachment 12: Cover Letter to Recipients of the Priority Mail Non-response Questionnaire 137

Attachment 13: Priority Mail Non-response Questionnaire 138

Attachment 14: Telephone Non-response Screener Script 142


TABLES


Table A1: Geographic Stratification Design 13

Table A2: Schedule for Survey Implementation 20

Table A3: Total Estimated Bottom Line Burden and Cost Summary 23

Table B1: Number of Households and Household Sample for Each EPA Study Region 29

Table B2: Number of Non-responding Households in the Priority Mail and Telephone Subsamples 32

Table B3: Illustration of an External Scope Test 49



PART A OF THE SUPPORTING STATEMENT


1. Identification of the Information Collection

1(a) Title of the Information Collection


Willingness to Pay Survey for Section 316(b) Existing Facilities Cooling Water Intake Structures: Instrument, Pre-test, and Implementation

1(b) Short Characterization (Abstract)


On February 16, 2004, the U.S. Environmental Protection Agency (EPA) took final action on the Phase II rule governing cooling water intake structures at existing facilities that are point sources; that, as their primary activity, both generate and transmit electric power, or generate electric power for sale to another entity for transmission; that use or propose to use cooling water intake structures with a total design intake flow of 50 million gallons per day (MGD) or more to withdraw cooling water from waters of the United States; and that use at least 25 percent of the withdrawn water exclusively for cooling purposes. See 69 FR 41576 (July 9, 2004). Industry and environmental stakeholders challenged the Phase II regulations. On judicial review, the Second Circuit (Riverkeeper, Inc. v. EPA, 475 F.3d 83 (2d Cir. 2007)) remanded several provisions of the Phase II rule, holding among other things that EPA improperly used cost-benefit analysis as a criterion for determining the Best Technology Available (BTA) and that EPA inappropriately used ranges in setting performance standards. In response, EPA suspended the Phase II regulation in July 2007 pending further rulemaking. The U.S. Supreme Court granted Entergy Corporation’s petition for a writ of certiorari solely on the question of whether EPA had the authority under §316(b) of the Clean Water Act to consider costs and benefits in decision-making. On April 1, 2009, the Court, in Entergy Corp. v. Riverkeeper, Inc., decided that “EPA permissibly relied on cost-benefit analysis in setting the national performance standards … as part of the Phase II regulations.” EPA is now taking a voluntary remand of the rule, thus ending Second Circuit review.

On June 1, 2006, EPA promulgated the 316(b) Phase III rule for existing manufacturers, small-flow power plants (facilities that use cooling water intake structures with a total design intake flow of less than 50 MGD to withdraw cooling water from waters of the United States and that use at least 25 percent of the withdrawn water exclusively for cooling purposes), and new offshore oil and gas facilities. Offshore oil and gas firms and environmental groups petitioned for judicial review in the Fifth Circuit; that review was stayed pending the Supreme Court’s decision in the Phase II case. EPA has petitioned the court for a voluntary remand of the existing facilities portion of the Phase III rulemaking. In developing the Phase III regulation, EPA began, but did not complete, a similar stated preference survey effort. The current effort builds on that earlier work.

EPA is now combining the two phases into one rulemaking covering all existing facilities. EPA will develop regulations to provide national performance standards for controlling impacts from existing cooling water intake structures (CWIS) under Section 316(b) of the Clean Water Act (CWA).

Under Executive Order 12866, EPA is required to estimate the potential benefits and costs to society for economically significant rules. To assess the public policy significance or importance of the ecological gains from the section 316(b) regulation for existing facilities, EPA requests approval from the Office of Management and Budget to conduct a stated preference survey. Data from the stated preference survey will be used to estimate the values (willingness to pay, or WTP) derived by households for changes related to the reduction of fish losses at CWIS, and to provide information to assist in the interpretation and validation of survey responses. As indicated in the prior literature (Cummings and Harrison 1995; Johnston et al. 2003a, 2005), it is virtually impossible to justify, theoretically, the decomposition of empirical estimates of use and non-use values. The survey will provide the flexibility, however, to estimate non-user values, using various non-user definitions drawn from responses to survey question 10. The structure of the choice attribute questions will also allow the analysis to separate value components related to the most common sources of use values: effects on harvested recreational and commercial fish. In summary, the survey will provide estimates of total values (including use and non-use), will allow estimates of value associated with specific choice attributes (following standard methods for choice experiments), and will also allow the flexibility to provide some insight into the relative importance of use versus non-use values in the 316(b) context.

One of the most important concerns in rulemaking is avoiding the double counting of benefits (or costs). Here, for example, the WTP estimates will include use and non-use values among a representative population sample. These may overlap, to a potentially substantial extent, with use value estimates provided through other methods, including revealed preference methods that might be used to estimate recreational anglers’ use values for fish kill reductions (i.e., through related improvements in fishing quality). When using the proposed stated preference value estimates for benefit estimation, particular care will be taken to avoid double counting of values that might also be derived from alternative valuation methods. In doing so, EPA will rely upon standard theoretical tools for non-market welfare analysis, as presented by authors including Freeman (2003) and Just et al. (2004). From a purely mechanistic perspective, survey results will be used to derive total values following standard practice for choice experiments (Adamowicz et al. 1998).

The target population for this stated preference survey is all individuals from continental U.S. households who are 18 years of age or older. The population of households will be stratified into four study regions: Northeast, Southeast, Inland, and Pacific. The Northeast survey region includes the North Atlantic and Mid Atlantic 316(b) benefits regions, the Southeast survey region includes the South Atlantic and Gulf of Mexico 316(b) benefits regions, the Pacific region includes states on the Pacific coast, and the Inland region includes all non-coastal states. In addition, EPA will administer a national version of the survey that does not require stratification. The sample of households in each region will be randomly selected from the U.S. Postal Service Delivery Sequence File (DSF), which covers over 97% of residences in the U.S. EPA intends to administer the mail survey to 7,628 households in order to achieve 2,288 completed responses assuming a 30% completion rate for selected households.

For the selection of households, the population of households in the 48 contiguous states and the District of Columbia will be stratified by the four study regions. The sample is allocated to each region in proportion to the total number of households in that region, with the restriction that at least 288 completed responses are obtained in each region. The sample of 288 households completing the national survey version will be distributed among the study regions in proportion to each region’s share of the regional survey sample, to ensure that respondents to the national survey version are distributed across the continental U.S. Non-response bias may arise if households that fail to return a completed mail survey differ systematically from those that respond. EPA will use a combination of telephone and priority mailing to conduct a non-response study. EPA will analyze the characteristics of the completed and non-completed cases from the mail survey and non-response questionnaire to determine whether there is any evidence of significant non-response bias in the completed sample. This analysis will indicate whether any weighting or statistical adjustment is necessary to minimize non-response bias in the completed sample.

As part of the testing of the survey instrument, EPA conducted a series of seven focus groups, with 8-10 participants per focus group, with approval from the Office of Management and Budget (OMB control # 2090-0028). The Agency conducted focus groups in several regions to account for potentially distinct regional information relevant to survey design. These focus groups were conducted following standard, accepted practices in the stated preference literature, as outlined by Mitchell and Carson (1989), Desvousges et al. (1984), Desvousges and Smith (1988), and Johnston et al. (1995). One of the focus groups incorporated individual cognitive interviews, as detailed by Kaplowitz et al. (2004). The focus groups and cognitive interviews allowed EPA to better understand the public's perceptions and attitudes concerning fishery resources, to frame and define survey questions, to pretest draft survey questions, to test for and reduce potential biases that may be associated with stated preference methodology, and to ensure that both researchers and respondents have similar interpretations of survey language and scenarios. In particular, cognitive interviews allowed for in-depth exploration of the cognitive processes used by respondents to answer survey questions, without the potential for interpersonal dynamics to sway respondents’ comments (Kaplowitz et al. 2004). Transcripts from these seven focus groups can be found in the docket for this ICR (ICR # 2402.01). EPA revised the survey based on the findings of the seven focus groups. These seven focus groups were conducted in addition to focus groups conducted previously by EPA under ICR #2155.01 to test the draft survey for the Phase III benefits analysis. Transcripts from the previously conducted focus groups for the Phase III analysis can be found in the docket for EPA ICR #2155.02 (Besedin et al., 2005). Findings from these previous focus groups were also incorporated into the development of the current survey.

The total national burden estimate for all components of the survey is 1,194 hours. The burden estimate is based on 2,288 respondents to the 7,628 mailed questionnaires and 600 respondents to the combined telephone and priority mail non-response survey. EPA assumes an average burden of 30 minutes per mail survey respondent, including the time necessary to complete and mail back the questionnaire, and 5 minutes for each participant in the non-response survey. Given an average wage rate of $20.42 per hour, the total respondent cost is $24,381.
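
As a check on the arithmetic, the short calculation below reproduces the bottom-line burden and cost estimates from the figures stated above (a minimal sketch; no assumptions beyond those already given in this paragraph are introduced).

    # Reproduces the bottom-line respondent burden and cost estimates stated above.
    mail_respondents = 2288        # completed mail questionnaires
    nr_respondents = 600           # telephone and priority mail non-response survey
    mail_minutes = 30              # assumed burden per mail survey respondent
    nr_minutes = 5                 # assumed burden per non-response participant
    wage_rate = 20.42              # average wage rate, dollars per hour

    total_hours = (mail_respondents * mail_minutes + nr_respondents * nr_minutes) / 60
    total_cost = total_hours * wage_rate

    print(total_hours)             # 1194.0 hours
    print(round(total_cost))       # 24381 dollars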


2. Need For and Use of the Collection

2(a) Need/Authority for the Collection


The project is being undertaken pursuant to section 104 of the Clean Water Act dealing with research. Section 104 of the Clean Water Act authorizes and directs the EPA Administrator to conduct research into a number of subject areas related to water quality, water pollution, and water pollution prevention and abatement. This section also authorizes the EPA Administrator to conduct research into methods of analyzing the costs and benefits of programs carried out under the Clean Water Act.

This project is exploring how public values for fishery resources are affected by fish losses from impingement and entrainment (I&E) mortality at cooling water intake structures. Understanding total public values for fishery resources, including the more difficult to estimate non-use values1, is necessary to determine the full range of benefits associated with reductions in I&E mortality losses, and whether the benefits of government action to reduce I&E mortality losses at existing facilities are commensurate with the costs of such actions. Because non-use values may be substantial, failure to recognize such values may lead to improper inferences regarding benefits and costs. The findings from this study will be used by EPA to improve estimates of the economic benefits of the section 316(b) regulation for existing facilities as required under Executive Order 12866.


2(b) Practical Utility/Users of the Data

EPA plans to use the results of the survey to improve estimates of the economic benefits of the section 316(b) regulation for existing facilities. Specifically, the Agency will use the survey results to estimate total values for preventing losses of fish through I&E mortality at CWIS, following standard practices outlined in the literature (Freeman 2003; Bennett and Blamey 2001; Louviere et al. 2000; U.S. EPA 2000).
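
For reference, the “standard practices” cited above rest on a random utility formulation of the choice experiment responses. The following is a generic sketch of that formulation only; the specific attributes, functional form, and estimation approach EPA will use are described in Part B, Section 5(b), and are not reproduced here.

    U_{ij} = \beta_c C_{ij} + \boldsymbol{\beta}' \mathbf{x}_{ij} + \varepsilon_{ij}

where U_{ij} is the utility respondent i obtains from policy option j, C_{ij} is the unavoidable cost to the respondent's household, \mathbf{x}_{ij} is the vector of non-cost attributes (e.g., fish saved per year, fish populations, condition of aquatic ecosystems), and \varepsilon_{ij} is a random error term. Under this standard linear-in-attributes formulation, marginal willingness to pay for attribute k is

    MWTP_k = -\beta_k / \beta_c

and total WTP for a policy that changes the attribute vector by \Delta\mathbf{x} relative to the status quo is

    WTP = -\frac{1}{\beta_c} \boldsymbol{\beta}' \Delta\mathbf{x}.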


3. Non-duplication, Consultations, and Other Collection Criteria

3(a) Non-duplication


There are many studies in the environmental economics literature that quantify benefits or willingness to pay (WTP) associated with various types of water quality and aquatic habitat changes. However, none of these studies allows the isolation of non-market WTP associated with quantified reductions in losses of forage fish. Most available studies estimate WTP for broader, and sometimes ambiguously defined, policies that simultaneously influence many different aspects of aquatic environmental quality and ecosystem services, but for which WTP associated with fish or aquatic life alone cannot be identified. Other studies provide benefit estimates associated with improvements in fish (or aquatic) habitat, but do not link these improvements to well-defined and quantified changes in affected or supported organisms. Still other studies address willingness to pay for changes in charismatic or recreational species that bear little relationship to the forage fish that make up the vast majority of species affected by cooling water intake structures.

For example, choice experiment studies such as Hanley et al. (2006a, 2006b) and Morrison and Bennett (2004) estimate WTP for aquatic ecosystem changes that affect fish, but the effects on fish are quantified and valued solely in terms of the presence/absence of different types of fish species. This approach renders associated results unsuitable for 316(b) benefit estimation. Also, many of these studies were conducted outside the U.S. (e.g., the European Union or Australia), making their use for benefit transfer to a U.S. policy context more challenging.

Other studies have estimated the value of changes in catch rates or populations of select recreational and commercial species, charismatic species such as salmon, or changes in water quality that affect fish, but none have specifically valued changes in forage fish populations. For example, Olsen et al. (1991) conducted a survey of Pacific Northwest residents, including both anglers and non-anglers, to determine their WTP for doubling the size of the Columbia River Basin salmon and steelhead runs. EPA’s proposed survey approach differs from this study and others like it (such as Cameron and Huppert 1989) in that it would include respondents from various geographic regions in the United States and would provide values for the full range of forage, recreational, and commercial species affected by 316(b) regulations, instead of valuing a few recreational species in one specific geographical area.

Among available studies, the most closely related is Johnston et al. (2010), which estimates total willingness to pay (WTP) for multi-attribute aquatic ecosystem changes related to improvements in forage fish in Rhode Island. Unlike other studies, the choice experiment data of Johnston et al. (2010) allow estimation of WTP associated with quantified changes in forage fish (e.g., WTP per fish or percentage change in fish), holding other ecological effects constant. That is, unlike results provided by other studies in the literature, the WTP estimates of Johnston et al. (2010) are not confounded with values for other changes, including water quality, habitat, overall ecological condition, charisma of species, etc. In addition, the choice experiment of Johnston et al. (2010) addresses species such as alewife and blueback herring that are neither subject to recreational or commercial harvest in Rhode Island nor considered charismatic. Hence, the species affected are a close analog to the forage fish affected in the 316(b) policy context.

Although the methods and data of Johnston et al. (2010) allow estimation of total values associated with specific improvements in forage and/or recreational fish, the policy context and scale of the survey prevent its direct use for analysis of national benefits of the 316(b) regulation. Specifically, Johnston et al. (2010) estimate Rhode Island residents’ preferences for the restoration of migratory fish passage over dams in the Pawtuxet and Wood-Pawcatuck watersheds. Hence, the case study is for a watershed-level policy with statewide welfare implications. In contrast, 316(b) policies would have nationwide implications, both on ecosystems and on affected facilities.


3(b) Public Notice Required Prior to ICR Submission to OMB


In accordance with the Paperwork Reduction Act (44 U.S.C. 3501 et seq.), EPA published two notices in the Federal Register on July 21, 2010 and January 21, 2011, announcing that the survey questionnaire and sampling methodology were available for comment. Copies of the first Federal Register notice (74 FR 42438) and second Federal Register notice (76 FR 3883) are attached at the end of this document (See Attachments 3 and 4, respectively). EPA received a number of comments on the proposed information collection, which are summarized in the following paragraphs. Also see docket # EPA-HQ-OW-2010-0595. EPA considered relevant comments on the draft survey when developing the survey questionnaire and sampling methodology for the current survey for existing facilities.

Some commenters expressed concern that the draft survey questionnaire and sampling methodology would not provide accurate estimates of WTP. Some stated that the proposed stated preference survey would overestimate WTP to prevent fish losses. Another commenter argued the opposite: that the proposed contingent valuation survey is biased against protecting ecosystems and will drastically undervalue non-use benefits. EPA agreed that certain details of the stated preference survey and supporting documentation for the July 21, 2010 and January 21, 2011 Federal Register notices required revision. EPA has revised the survey instrument and supporting documentation accordingly. Following OMB approval of the focus groups, EPA conducted seven focus groups (including one set of cognitive interviews) to pretest the draft survey materials, to test for and reduce potential biases that may be associated with stated preference methodology, and to ensure that both researchers and respondents have similar interpretations of survey language and scenarios. As a result of this extensive pre-testing, a number of revisions were made to the draft survey that significantly improved its reliability and reduced its potential for bias. Hence, many of the survey design elements with which commenters took issue have already been removed or changed. In many other instances, however, focus group results showed little cause for concern, suggesting that many of the speculative claims raised by commenters have little relevance to the population being surveyed. EPA also notes that the number of focus groups and interviews conducted for this survey, and for the draft survey tested in 2005-06, far exceeds the number conducted for most stated preference surveys found in the published literature.

EPA notes that the survey proposed in this ICR is different in many ways from the draft survey for the Phase III benefits analysis that was peer reviewed in 2005 (Versar 2006). While findings from pre-testing of the previous draft survey were considered when developing the present survey, the present survey has undergone various revisions based on additional analysis and the results of recent focus groups. Due to these differences, EPA notes that many of the peer review comments received on the Phase III draft survey are no longer relevant for the survey proposed in this ICR. The Agency, however, takes all peer review comments very seriously, and has accounted for these comments in all survey revisions and in the development of the present stated preference survey.

One commenter argued that in conducting the stated preference survey, EPA should use the willingness-to-accept (WTA) metric in place of or in addition to WTP. EPA notes that good practice guidelines for stated preference surveys almost universally indicate the use of WTP elicitation mechanisms over WTA elicitation mechanisms. This is due to the potential for biases in WTA stated preference surveys that can be ameliorated by the use of the WTP format. WTP is also considered the more conservative choice, although in most cases the divergence between WTA and WTP predicted by theory should be very small. EPA follows standard practice in proposing a WTP format in order to avoid these biases, comply with guidance and practice in the stated preference literature, and ensure a conservative benefit estimate.

Some commenters argued that hypothetical bias in the survey questionnaire would inhibit respondents’ ability to provide meaningful survey responses. EPA agrees that hypothetical bias is an important concern with stated preference surveys and has taken this concern very seriously in survey development and testing. EPA does not agree that this inhibits respondents or that the literature suggests that hypothetical bias is unavoidable; in fact, the published literature includes approaches to mitigate such biases. The Agency followed the published literature in designing mitigation strategies to eliminate, or at a minimum reduce, the potential for hypothetical bias. This includes explicitly designing the survey to maximize the consequentiality of choice experiment questions through direct linkages to proposed EPA regulatory efforts. Moreover, the survey explicitly incorporates elements such as a certainty follow-up question to enable mitigation of any remaining hypothetical bias (Ready et al. 2010). Focus group and interview transcripts show that, when asked explicitly, respondents almost universally indicated that their answers to choice questions in the survey instrument would be identical if the same questions were encountered in a binding referendum. This indicates that there is little evidence of hypothetical bias within the draft survey.

A commenter argued that the respondents to the survey would be informed and conditioned based on the information included in the survey and that this would lead to the creation of preferences where none existed before. EPA believes that this result is entirely expected and is consistent with the academic literature. The Agency emphasizes that the sensitivity of values (and behavior) to information is true of both market and non-market values (Bateman et al. 2002, p. 298), and in no way invalidates the non-market values that would be estimated through the proposed stated preference approach. It is common practice in such surveys to provide substantial information to survey respondents. The survey information was pretested extensively during the seven focus group sessions, and EPA revised or removed informational elements that respondents found confusing or misleading.

Some commenters argued that the survey results will be unreliable because the survey questionnaire contains inaccurate statements and comparisons which overstate the resource impacts from baseline I&E mortality and the regulatory options presented in the survey. EPA recognizes the importance of accurately characterizing resource and regulatory impacts and notes that the survey is based on the best biological and engineering data available. Despite the general observation that I&E impacts are small compared to other effects, I&E has been shown to have measurable impacts on local fish populations and communities. Importantly, the increases in fishery sustainability and fish population values presented in the survey instrument are small. Overall, based largely on feedback from focus group participants, the Agency rejects commenters’ claims that impacts are misrepresented, and emphasizes that the survey is explicitly designed to provide respondents with an understanding of the proposed policies that is as accurate as possible given the best available ecological science.

Commenters also questioned whether survey respondents would have sufficient comprehension of the issues in order to provide meaningful responses to the survey’s valuation questions. EPA agrees that in order to receive meaningful responses, a stated preference survey should provide information to respondents about the hypothetical commodity so that they understand and accept it and can give meaningful answers to the valuation questions. Given the importance of commodity comprehension, EPA devoted considerable attention to comprehension of the hypothetical commodity and related issues (e.g., understanding of the payment vehicle, understanding of the ecological scores) during the recent focus groups and cognitive interviews, as well as during the focus groups conducted in 2005 for the original version of the Phase III survey instrument. Focus group participants showed no difficulty understanding the format of the payment vehicle and understood that selecting Option A or B would result in increased costs to their household. They also correctly understood this cost as ongoing.

In addition to the comments regarding the payment vehicle, commenters expressed concern regarding respondent comprehension of the ecological scores used in the survey. EPA emphasizes that the reaction and understanding of likely respondents, as opposed to experts, is crucial when testing the communication of ecological information in stated preference surveys. The survey has undergone substantial changes in the way it communicates ecological information based on the results of the seven focus group sessions. For example, in the revised survey, EPA provides more precise information regarding the definition of “young adult fish” on Page 4, which currently states: “After accounting for the number of eggs and larvae that would be expected to survive to adulthood, scientists estimate that the equivalent of about 1.1 billion young adult fish (the equivalent of one year old) are lost each year in Northeast coastal and fresh waters due to cooling water use.” Overall, focus group and interview participants’ statements reflected differing opinions about the importance of preventing fish losses versus increasing fish populations or improving the condition of aquatic ecosystems. There also did not appear to be any confusion over the fact that scores of 100 for various attributes are generally unattainable through reductions in CWIS fish losses alone.

One commenter argued that the stated preference survey fails to account for effects on a number of non-fish species, as well as effects on threatened, endangered, and other protected species. In response to this comment, EPA agrees, but notes that focus group respondents suggested that additions to the survey’s length should be avoided. Thus, while EPA will not be able to use these survey results to represent a fully comprehensive benefits estimate, it will be able to state that a potentially substantial category of benefits, non-use benefits, has been included. As is common in surveys, EPA has chosen to present policy scenarios in simplified form to facilitate respondent comprehension, and to encourage respondents to focus on the most important policy characteristics related to fish losses. Such simplification of the survey helps to balance the provision of detailed policy information against respondents’ cognitive abilities to consider a large number of attributes simultaneously (Louviere et al. 2000).

Some commenters stated that EPA did not sufficiently emphasize the uncertainty associated with the effects and costs of the proposed policies presented within the survey. EPA agrees that there is uncertainty regarding the number of fish killed annually, as well as the effects and costs of the regulatory policies presented within the survey. The current survey for existing facilities explicitly notes this uncertainty. For example, the following statements are included in the current survey version for the Northeast region: “Although scientists can predict the number of fish saved each year, the effect on fish populations is uncertain. This is because scientists do not know the total number of fish in Northeast waters and because many factors – such as cooling water use, fishing, pollution, and water temperature – affect fish populations” and “Policy costs and effects depend on many factors.” EPA has also included debriefing questions in the survey instrument that are designed to identify individuals whose responses are based on incorrect interpretation of the environmental changes described in the survey, including the uncertainty of the expected changes. EPA points out that debriefing sessions during focus groups and cognitive interviews showed that respondents clearly understood that the ecological changes described in the survey were uncertain. Furthermore, when asked, focus group respondents indicated that they were comfortable making decisions in the presence of uncertainty.

Another commenter questioned various components of EPA’s proposed sampling methodology, experimental design, and methods for accounting for non-response bias. In response, EPA emphasizes that methods for WTP estimation from ecological choice experiment data, of exactly the type proposed by EPA, are well established in the published literature. More broadly, when designing the proposed methods, EPA closely followed accepted contemporary methods in the published literature for the estimation of WTP distributions under statistical uncertainty. In the absence of concrete and established alternatives for the choice of sampling weights, EPA has proposed the more conservative approach of relying on accepted, standard methods from the stated preference literature. Following guidance in the literature (Arrow et al. 1993) and in its own guidance documents (U.S. EPA 2000) for a weighting and extrapolation approach, EPA believes that established stated preference methods are capable of producing reliable and accurate welfare measures when surveys and approaches are appropriately designed. Regarding the potential for non-response bias, EPA has proposed standard tests and corrections for non-response based on a small number of attitudinal and behavioral questions, combined with demographic characteristics. These reflect current standard practice in the literature.
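
To make the non-response correction concrete, the sketch below illustrates a simple weighting-class adjustment of the general kind described above (illustrative only; the adjustment cells, variable names, and use of the pandas library are assumptions for this example, not EPA’s final procedure).

    import pandas as pd

    # One row per sampled household; "cell" is the weighting class (for example,
    # a region-by-demographic group) and "responded" flags completed surveys.
    sample = pd.DataFrame({
        "cell":      ["NE-A", "NE-A", "NE-B", "NE-B", "NE-B"],
        "responded": [1, 0, 1, 1, 0],
    })
    sample["base_weight"] = 1.0  # design weights would come from the sampling plan

    # Response rate within each cell; respondent weights are inflated by its
    # inverse so respondents also represent non-respondents in their cell.
    rates = sample.groupby("cell")["responded"].mean()
    sample["adj_weight"] = sample.apply(
        lambda row: row["base_weight"] / rates[row["cell"]] if row["responded"] else 0.0,
        axis=1)

    print(sample)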

Another commenter argued that although the Supreme Court held that Clean Water Act section 316(b) does not prohibit the consideration of costs in relation to the benefits of a proposed rule, it did not find that cost-benefit analysis is required (Entergy Corp. v. Riverkeeper, Inc.), and that EPA therefore has the authority to decide whether to conduct cost-benefit analysis of proposed rule options. The commenter, however, recognized that Executive Order 12866, “Regulatory Planning and Review,” requires EPA to estimate the potential costs and benefits to society of proposed rule options. In response to the commenter’s claim about the utility of cost-benefit analysis in an environmental context, EPA notes that cost-benefit analysis is only one tool that can be used to inform policy decisions. EPA is conducting this survey because of Executive Order 12866, which requires Federal agencies to conduct economic impact and cost-benefit analyses for all major rules. Furthermore, cost-benefit analysis requires a comprehensive estimate of total social benefits, including non-use values. The current information collection would provide valuable information regarding the total social benefits of the 316(b) regulation for existing facilities, thus enabling the Agency to perform a cost-benefit analysis for the regulation, should it choose to do so, without ignoring a potentially important category of benefits (non-use values), and to satisfy the requirements of Executive Order 12866.

For a more detailed discussion of the issues raised by commenters on this ICR, see EPA’s response to public comments on the Federal Register notices published on July 21, 2010 (74 FR 42438) and January 21, 2011 (76 FR 3883). For a discussion of the issues raised by commenters on the previous Phase III survey ICR, see EPA’s response to public comments on the Federal Register notice published on June 9, 2005 (70 FR 33746). For a discussion of issues raised by commenters on the previous Phase III focus group ICR, see EPA's response to public comments on the Federal Register notice published on November 23, 2004 (69 FR 68140).


3(c) Consultations

The Principal Investigator for the stated-preference portion of this effort is Dr. Robert Johnston. Dr. Johnston is assisted by Dr. Elena Besedin, a Senior Economist at Abt Associates Inc. Dr. Erik Helm at the U.S. Environmental Protection Agency serves as the project manager and a contributor to this research.

Robert J. Johnston is Director of the George Perkins Marsh Institute and Professor of Economics at Clark University. He is President-elect of the Northeastern Agricultural and Resource Economics Association (NAREA), serves on the Program Committee for the Charles Darwin Foundation and on the Science Advisory Board for the Communication Partnership for Science and the Sea (COMPASS), and is Vice President of the Marine Resource Economics Foundation. Professor Johnston has published extensively on the valuation of non-market commodities (goods, services, and resources), benefit-cost analysis, and resource management. His recent research emphasizes coordination of ecological and economic models to estimate ecosystem service values, with particular emphasis on the role of aquatic ecological indicators. He has also worked extensively on methodologies for benefit transfer, including the use of meta-analysis. Professor Johnston’s empirical work on non-market valuation and benefit transfer has contributed to numerous benefit-cost analyses conducted by federal, state, and local government agencies in the U.S., Canada, and elsewhere.

Elena Y. Besedin, a senior economist at Abt Associates Inc., specializes in the economic analysis of environmental policy and regulatory programs. Her work to support EPA has concentrated on analyzing economic benefits from reducing risks to the environment and human health and assessing environmental impacts of regulatory programs for many EPA program offices. She has worked extensively on valuation of non-market benefits associated with environmental improvements of aquatic resources. Dr. Besedin’s empirical work on non-market valuation includes design and implementation of stated and revealed preference studies and benefit transfer methodologies. Her recent work has focused on developing integrated frameworks to value changes in ecosystem services stemming from environmental regulations.

EPA notes that the current survey instrument is built upon an earlier version that was peer reviewed in January 2006 and incorporates recommendations received from the first peer review panel. Because the final product of this study meets the major technical work criteria specified in the Peer Review Handbook (U.S. EPA 2006), the Agency also plans to convene a peer review panel, after the survey is completed, to review the entire survey process, including the survey instrument, study results, and EPA’s final estimates for the 316(b) Existing Facilities rulemaking.


3(d) Effects of Less Frequent Collection


The survey is a one-time activity. Therefore, this section does not apply.


3(e) General Guidelines


The survey will not violate any of the general guidelines described in 5 CFR 1320.5 or in EPA’s ICR handbook.


3(f) Confidentiality


All responses to the survey will be kept confidential to the extent provided by law. To ensure that the final survey sample includes a representative and diverse population of individuals, the survey questionnaire will elicit basic demographic information, such as age, household size, employment status, and income. However, the detailed survey questionnaire will not ask respondents for personal identifying information, such as names or phone numbers. Prior to taking the survey, respondents will be informed that their responses will be kept confidential to the extent provided by law. The survey data will be made public only after it has been thoroughly vetted to ensure that all potentially identifying information has been removed.


3(g) Sensitive Questions


The survey questionnaire will not include any sensitive questions pertaining to private or personal information, such as sexual behavior or religious beliefs.


4. The Respondents and the Information Requested

4(a) Respondents

The target population for the Stated Preference Survey is all individuals from continental U.S. households who are 18 years of age or older. Survey participants are selected randomly from the U.S. Postal Service Delivery Sequence File (DSF), which covers over 97% of residences in the U.S. The addresses sampled from the DSF include city-style addresses and P.O. boxes, and cover single-unit, multi-unit, and other types of housing structures. EPA will send a copy of the mail survey to a stratified random sample of 7,628 households. Approximately 2,288 of the 7,628 adults sent a survey are expected to return a completed survey.

For the selection of households, the population of households in the 48 states and the District of Columbia will be stratified by four study regions. There are a total of seven study regions for purposes of evaluating the 316(b) existing facilities rule benefits. For the purposes of the stated preference survey implementation, EPA uses four geographic regions: Northeast, Southeast, Inland, and Pacific. The Northeast region includes the North Atlantic and Mid Atlantic regions, the Southeast region includes the South Atlantic and Gulf of Mexico regions, the Pacific region includes states on the Pacific coast, and the Inland region includes all non-coastal states.

A sample of 2,000 households would complete a version of the survey that specifically addresses policies within their region. The total sample completing regional survey versions is allocated to each region in proportion to the total number of households in that region, with the restriction that at least 288 persons respond in each region. This is the number required to estimate the main effects and interactions under the experimental design model. The total sample size for each region is much larger than the minimum sample size required for model estimation for all but one region (Pacific). An additional sample of 288 households will complete a national survey version, which addresses policies at the national scale. This sample would be distributed among the study regions based on the percentage of regional survey sample (as shown in Table A1) to ensure that respondents to the national survey version are distributed throughout the continental U.S. An illustrative allocation calculation is provided following Table A1, and Part B of this document provides detail on the sampling methodology.

Table A1 shows the stratification design for the geographic regions covered by the sample for this survey. More detail on planned sampling methods and the statistical design of the survey can be found in Part B of this supporting statement.


Table A1: Geographic Stratification Design

Region | States Included | Sample Size (a) | Percentage of Sample
Northeast | CT, DC, DE, MA, MD, ME, NH, NJ, NY, PA, RI, VT | 417 | 21%
Southeast | AL, FL, GA, LA, MS, NC, SC, TX, VA | 562 | 28%
Inland | AR, AZ, CO, ID, IA, IL, IN, KS, KY, MI, MN, MO, MT, NM, OK, ND, NE, NV, OH, TN, SD, UT, WI, WV, WY | 732 | 37%
Pacific | CA, OR, WA | 289 | 14%
Total for Regional Survey Versions | U.S. (excluding AK and HI) | 2,000 | 100%
National Survey Version | U.S. (excluding AK and HI) | 288 | -

(a) Sample sizes presented in this table include only the 2,288 individuals returning completed mail surveys.
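
The allocation in Table A1 follows the rule described above: regional completed-response targets are set in proportion to regional household counts, subject to a floor of 288 completes per region, and mailings are sized using the assumed 30% completion rate. A minimal sketch of this arithmetic is given below (the household counts are illustrative placeholders rather than the Census figures EPA will use, and the rounding conventions are assumptions).

    import math

    def allocate(total_completes, households, floor=288, response_rate=0.30):
        """Proportional allocation of completed-survey targets with a per-region
        floor, plus the number of mailings implied by the assumed response rate."""
        total_hh = sum(households.values())
        completes = {region: max(floor, round(total_completes * n / total_hh))
                     for region, n in households.items()}
        # Note: if the floor binds, the remaining regions would be rescaled so
        # that regional completes still sum to total_completes.
        mailings = {region: math.ceil(c / response_rate)
                    for region, c in completes.items()}
        return completes, mailings

    # Illustrative household counts only (placeholders, not Census data).
    households = {"Northeast": 22e6, "Southeast": 31e6, "Inland": 40e6, "Pacific": 15e6}
    completes, mailings = allocate(2000, households)
    print(completes)
    print(mailings)

Applying the same mailing arithmetic to the completed-response targets actually shown in Table A1 (417, 562, 732, and 289 regional completes plus 288 national completes) and rounding up within each stratum gives 1,390 + 1,874 + 2,440 + 964 + 960 = 7,628 mailings, which is consistent with the total mailing count cited in this statement; EPA's exact rounding convention is not stated here, so this reconciliation is illustrative.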



4(b) Information Requested

(I) Data items, including recordkeeping requirements

Households randomly selected from the U.S. Postal Service DSF database will be mailed a copy of the survey. The full text of the regional version of the mail survey for the Northeast region is provided in Attachment 1 and the full text of the national version of the mail survey is provided in Attachment 2. EPA revised the survey based on the findings of a series of seven focus groups conducted as part of survey instrument development (OMB control # 2090-0028). Additional information regarding focus group implementation is provided in Section 5(b). EPA has determined that all questions in the survey are necessary to achieve the goal of this information collection, i.e., to collect data that can be used to support an analysis of the total benefits of the 316(b) regulation.

The following is an outline of the major sections of the survey.

Relative Importance of Issues Associated with Industrial Cooling Water. The first survey question asks respondents to rate the general importance of (a) preventing the loss of fish caught by humans, (b) preventing the loss of fish not caught by humans, (c) maintaining ecological health in rivers, lakes, and bays, (d) keeping the cost of goods and services low, (e) making sure there is enough government regulation on industry, and (f) making sure there is not too much government regulation on industry. This question is designed to elicit the respondent’s general preferences for regulation, reductions in fish losses, and ecological health. It also places respondents in the mindset where they are cognizant of the range of issues associated with the use of cooling water by industrial facilities.

Concern for Policy Issues. The second survey question asks respondents to rate the general importance of protecting aquatic ecosystems compared to other issues that the government might address. This question is designed to remind respondents that there are other issues (such as public safety, education, and health) to which the government could direct funds, rather than spending these funds to prevent fish losses. Such questions are commonly used in introductory sections of stated preference surveys (e.g., Mitchell and Carson 1984), in order to place respondents in a mindset in which they are cognizant that there are substitute goods and policy issues to which they might direct their scarce household budgets.

Relative Importance of Effects. Question 3 asks the respondent to rate the importance of each of the effects captured by the five scores: (a) commercial fish populations, (b) fish populations (for all fish), (c) fish saved, (d) condition of aquatic ecosystems, and (e) cost to my household. This question is designed to promote understanding of the scores by placing respondents in a mindset where they consider the meaning of each score and their general preferences for these effects before considering specific policy options. The question also promotes understanding by telling respondents that they can return to previous pages for reminders of what the scores mean.

Voting for Regulations to Prevent Fish Losses in the Respondent’s Region. Questions 4, 5, and 6 are “choice experiment” or “choice modeling” questions (Adamowicz et al. 1998; Bennett and Blamey 2001), and ask respondents to choose how they would vote if presented with two hypothetical regulatory options (and a third “status quo” choice to reject both options) for waters within the respondents’ region (e.g., Northeast waters). Each of the multi-attribute options is characterized by (a) commercial fish populations (in 3-5 years), (b) fish populations (all fish; in 3-5 years), (c) fish saved per year (out of [total] fish lost in water intakes), (d) condition of aquatic ecosystems (in 3-5 years), and (e) an unavoidable cost of living increase for the respondent’s household. Following standard choice experiment methods, respondents choose among the regulatory options based on their preferences. Respondents always have the option to vote for neither program; providing this status quo option is necessary for appropriate welfare estimation (Adamowicz et al. 1998). Advantages of choice experiments, and the many examples of the use of such approaches in the literature, are discussed in later sections of this ICR. Following standard approaches (Opaluch et al. 1993, 1999; Johnston et al. 2002a, 2002b, 2003b), respondents are instructed to answer each of the three choice questions independently, and not to “add up or compare programs across different pages.” This instruction is included to avoid biases associated with sequence aggregation effects (Mitchell and Carson 1989). EPA will also vary the order in which the policy option attributes are presented across respondents, such as presenting household cost first or presenting fish saved per year lower in the list of choice question attributes. While complete randomization is impractical for the mail survey, the variation in order will allow a test for potential ordering effects.

Reasons for Voting “No Policy”. Question 7 is a follow-up to the prior voting questions, and asks respondents to identify the primary reason for voting no, if they always voted for “no policy” in questions 4-6. It is designed to identify respondents whose “no policy” responses are based on their budget constraint, respondents who do not consider fish losses important enough to vote for a policy, or respondents who ignored information presented in the survey and answered questions based on their general convictions and principles. In an electronic survey format, respondents who voted for a policy would not see this question, potentially reducing burden.

Respondent Certainty and Reasons for Voting. Questions 8 and 9 are follow-up questions to the prior voting questions. Question 8 assesses the certainty that respondents feel in their choice experiment responses, following the methods of Champ et al. (2004, 2009), Akter et al. (2009), Kobayashi et al. (2010), and others. It is designed to identify respondents whose responses are based on incorrect interpretation of the resource changes and the uncertainty of ecological outcomes from policy options. EPA will evaluate respondent understanding when completing stated preference surveys using methods and approaches discussed in Boyle (2003), Kaplowitz et al. (2004), Bateman et al. (2002), Powe (2007), and others. Question 9 asks respondents to rate the effect of various factors on their choices, and why they voted for or against the regulatory programs. Responses to such questions have been used in the literature to help control for hypothetical bias; an illustration of this type of certainty-based calibration is provided following this outline.

Recreational Experience. Question 10 asks respondents how often they participated in specific types of water-related recreational activities within the last year. This question can be used to identify non-users of the fishery resource, thereby allowing the estimation of non-user values for I&E mortality reductions.2 Examples of this approach to the estimation of non-user values are provided by Johnston et al. (2005a), Whitehead et al. (1995), Croke et al. (1986), Olsen et al. (1991), Cronin (1982), Whitehead and Groothuis (1992), and Mitchell and Carson (1981).

Fish Consumption. Question 11 asks respondents whether they consume commercially and recreationally caught seafood. This information will be used to identify respondents who are potentially affected by changes in the commercial and recreational fisheries.

Demographics. Questions 12-22 ask respondents to provide basic demographic information, including age, gender, highest level of education, household size, household composition, zip code, employment status, and household income. This information will be used in the analysis of survey results, as well as in the non-response analysis.

Comments. The survey offers respondents a chance to comment on the survey.
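
As referenced under Question 8 above, certainty follow-up responses are commonly used to recode uncertain policy votes prior to estimation as one way of mitigating hypothetical bias. The sketch below illustrates the general technique (the variable names, the assumed certainty scale, and the cutoff value are illustrative assumptions, not EPA’s final analysis plan).

    import pandas as pd

    # One row per choice question response; "vote" is the selected alternative and
    # "certainty" is the follow-up rating (assumed here to range from 1 to 10).
    responses = pd.DataFrame({
        "respondent": [1, 1, 2, 2],
        "question":   [4, 5, 4, 5],
        "vote":       ["Option A", "No Policy", "Option B", "Option A"],
        "certainty":  [9, 10, 4, 7],
    })

    CUTOFF = 7  # calibration threshold (an assumption for illustration)

    # Recode low-certainty votes for a policy option to the status quo ("No Policy")
    # before estimating the choice model.
    uncertain_policy_vote = (responses["vote"] != "No Policy") & (responses["certainty"] < CUTOFF)
    responses["vote_calibrated"] = responses["vote"].where(~uncertain_policy_vote, "No Policy")

    print(responses)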


The Agency will modify the survey instrument for each region relative to the regional survey shown in Attachment 1 for the Northeast region as follows:

  • Cover – The text on the cover reads “A Survey of Northeast Residents (CT, DC, DE, MA, MD, ME, NH, NJ, NY, PA, RI, VT)”. “Northeast Residents” and the list of included states will be changed to match the respondent’s region.

  • Cover – The photo on the cover is of a forage species (silversides) found in the Northeast region. If that species is not found in other regions, EPA will replace the cover photo with a photo of a comparable forage species relevant to that region.

  • Page 1 – The survey states that it “asks for your opinions regarding policies that would affect fish and habitat in the Northeast U.S.” “Northeast U.S.” will be replaced with the respondent’s region or simply, “the U.S.” for the national version.

  • Page 1 – Includes the statement that “Northeast fresh and salt waters support billions of fish.” “Northeast” will be replaced with the respondent’s region or “the U.S.” for the national version and “salt water” will be removed for the Inland region.

  • Page 2 - The survey states that “Cooling water use affects fresh and salt waters throughout the Northeast US, but 93% of all fish losses are in coastal bays, estuaries, and tidal rivers.” For the Southeast and Pacific regions and the national version of the survey, which include both salt water and freshwater facilities, reference to the Northeast region would be replaced with the name of the respondent’s region or “the U.S.” for the national version of the survey. The percentage of fish losses occurring in coastal bays, estuaries, and tidal rivers will also be changed to reflect regional or national losses. For the Inland region which only includes freshwater facilities, reference to salt water will be removed.

  • Page 2 – A map of the Northeast region and facility locations is presented. A comparable map will be produced for each region and for the U.S., and the map included in the survey will correspond to the respondent’s region or to the U.S. for the national version.

  • Page 4 - This page provides the range of fish saved under different policy options. This range will be updated to reflect totals for the respondent’s region or national totals.

  • Page 5 - The total losses (1.1 billion) included within the survey correspond to losses from facilities in Northeast waters. This total would be replaced with the total losses for the respondent’s region or total losses for the U.S. for the national version. The pie charts will be updated based on the regional (national) total and percent regional improvements under policy options.

  • Page 7 - The table text describing each score will be changed to include the current scores for the respondent’s region.

  • Page 10 – The included figure illustrates the location of “Facilities Using Cooling Water Intake”. The figure will be replaced with a map showing facility locations within the respondent’s region or in the U.S. for the national version.

  • Pages 11-14 – Regional references within the table headings in Questions 4, 5, and 6 (e.g., “Policy Effect NE Waters”) will be modified to refer to the policy effects and options for the respondent’s region. The “Fish Saved per Year” score includes a note reminding the respondent of the total fish lost within the region due to I&E (e.g., 1.1 billion); this number will be changed to reflect total losses within the respondent’s region or total national losses. The values describing the current situation and Options A and B within choice questions 4, 5, and 6 will also vary across regions.

(II) Respondent activities


EPA expects individuals to engage in the following activities during their participation in the valuation survey:

  • Review the background information provided in the beginning of the survey document.

  • Complete the survey questionnaire and return it by mail.

A typical subject participating in the mail survey is expected to take 30 minutes to complete the survey. This estimate is derived from focus groups and cognitive interviews in which participants were asked to complete a survey of similar length and detail to the current survey.


5. The Information Collected - Agency Activities, Collection Methodology, and Information Management

5(a) Agency Activities


The survey is being developed, conducted, and analyzed by Abt Associates Inc. and is funded by EPA contract No. EP-C-07-023, which provides funds for analyzing the economic benefits of the proposed rule for existing facilities subject to the section 316(b) regulation. Agency activities associated with the survey consist of the following:

  • Developing the survey questionnaire and sampling design.

  • Randomly selecting survey participants from the U.S. Postal Service DSF database.

  • Printing of survey.

  • Mailing of preview letter to notify the household that it has been selected.

  • Mailing of surveys.

  • Mailing of postcard reminders.

  • Resending the survey to households not responding to the first survey mailing.

  • Mailing the follow-up letter reminding households to complete the second survey mailing.

  • Conducting a follow-up study of non-respondents to the mail survey using a combination of telephone and priority mailing to reach nonrespondents.

  • Data entry and cleaning.

  • Analyzing survey results.

  • Analyzing the non-response study results.

  • Using results of the non-response study, if necessary, to adjust respondent weights to account for non-response and minimize the resulting bias.

Although not covered under this ICR, EPA will primarily use the survey results to estimate the social value of changes in I&E mortality losses of forage, recreational, and commercial species of fish, as part of the Agency’s analysis of the benefits of the 316(b) rule for existing facilities. If reliable environmental data were to be developed for population changes and other ecosystem impacts, social values for these benefit types may also be assessed for the rulemaking using survey results.


5(b) Collection Methodology and Information Management


To pretest the survey questionnaire, EPA conducted a series of seven focus groups, including one using cognitive interview methodologies under a different ICR (OMB control # 2090-0028). Focus groups provided valuable feedback which allowed EPA to iteratively edit and refine the questionnaire, and eliminate or improve imprecise, confusing, and redundant questions. Focus groups and cognitive interviews were conducted following standard approaches in the literature, as outlined by Desvousges et al. (1984), Desvousges and Smith (1988), Johnston et al. (1995), Schkade and Payne (1994), Kaplowicz et al. (2004), and Opaluch et al. (1993).

EPA plans to implement the proposed survey as a mail choice experiment questionnaire. First, EPA will use the U.S. Postal Service DSF database to identify households that will receive the mail questionnaire. Prior to mailing the survey, EPA will send the selected households a preview letter notifying them that they have been selected to participate in the survey and briefly describing the purpose of this study. The survey will be mailed one to two weeks after the preview letter, accompanied by a cover letter explaining the purpose of the survey. The preview and cover letters are included as Attachments 6 and 7, respectively.

EPA will take multiple steps to promote response. All households will receive a reminder postcard approximately one week after the initial questionnaire mailing. The postcard reminder is included as Attachment 8. Approximately three weeks after the first round of survey mailing, all households that have not responded will receive a second copy of the questionnaire with a revised cover letter (see Attachment 9). A week after the second survey is mailed, a letter will be sent to remind households to complete the survey. The letter reminder is included as Attachment 10. Based on this approach to mail data collection, it is anticipated that approximately 30 percent of the selected households will return the completed mail survey. Since the desired number of completed surveys is 2,288, it will be necessary to mail surveys to 7,628 households (Dillman 2000).
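The mailing count follows directly from the target number of completes and the assumed 30 percent response rate; the minimal sketch below reproduces that arithmetic (the one-survey gap relative to the 7,628 figure reflects rounding within the regional allocations).

```python
import math

def surveys_to_mail(target_completes: int, response_rate: float) -> int:
    """Number of questionnaires to mail so that, at the assumed response
    rate, the expected number of returned surveys meets the target."""
    return math.ceil(target_completes / response_rate)

# Figures from this ICR: 2,288 desired completes at an assumed 30% response rate.
print(surveys_to_mail(2288, 0.30))  # 7627; the ICR's 7,628 reflects per-region rounding
```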

Data quality will be monitored by checking submitted surveys for completeness and consistency, and by asking respondents to assess their own responses to the survey. Question 8 asks respondents to rate their understanding of the survey and their confidence in their responses. Questions 7 and 9 are designed to assess the presence or absence of potential response biases by asking respondents to indicate their reasoning and rate the influence of various factors on their responses to the choice experiment questions. Responses to the survey will be stored in an electronic database. This database will be used to generate a data set for a regression model of total values for reductions in fish I&E mortality by section 316(b) existing facilities.

To protect the confidentiality of survey respondents, the survey data will be released only after it has been thoroughly vetted to ensure that all potentially identifying information has been removed.


5(c) Small Entity Flexibility


This survey will be administered to individuals, not businesses. Thus, no small entities will be affected by this information collection.


5(d) Collection Schedule


The schedule for implementation of the survey is shown in Table A2. The Northeast survey version will serve as a pilot study implemented ahead of the other survey versions. Responses to the Northeast survey and preliminary findings will be used to inform EPA regarding response rates and the quality of survey data. EPA will evaluate the Northeast responses and determine whether any changes to the survey instrument or implementation approach are needed.


Table A2: Schedule for Survey Implementation

Activity | Duration: Northeast Version | Duration: All Other Survey Versions
Printing of questionnaires | Weeks 1 to 2 | Weeks 8 to 9
Mailing of Preview Letters | Week 3 | Week 10
Mailing of survey | Week 4 | Week 11
Postcard reminder (one week after initial survey mailing) | Week 5 | Week 12
Initial Data Entry and Pilot Tests | Week 6 | -
Mailing of 2nd survey to non-respondents | Week 8 | Week 13
Letter reminder (one week after 2nd survey mailing) | Week 9 | Week 14
Telephone non-response interviews | Weeks 11 to 13 | Weeks 16 to 18
Ship priority mail non-response survey | Week 11 | Week 16
Data entry | Weeks 4 to 14 | Weeks 11 to 19
Cleaning of data file | Week 15 | Week 20
Delivery of data | Week 16 | Week 21



6. Estimating Respondent Burden and Cost of Collection

6(a) Estimating Respondent Burden


Subjects who participate in the survey and follow-up interviews will expend time on several activities. Based on the administration of the mail survey to 7,628 households, the national burden estimate for all respondents is 1,144 hours assuming that 2,288 respondents will complete and return the survey. Based on pretests conducted in focus groups, EPA estimates that on average each respondent mailed the survey will spend 30 minutes reviewing the introductory materials and completing the survey questionnaire. Thus, the average burden per respondent is 30 minutes (0.5 hours) for these 2,288 respondents to the mail survey.

EPA plans to conduct a non-response follow-up study that uses a short questionnaire and a combination of telephone and priority mailing. The short version of the questionnaire is included in Attachment 13. The short questionnaire will be administered by phone to 200 nonrespondents and by priority mail to 400 nonrespondents. EPA estimates that telephone non-response interviews will take 5 minutes (0.08 hours) per interview for each of the 200 households completing interviews. EPA estimates that each of the 400 households completing the mail version of the short questionnaire will take 5 minutes to do so (0.08 hours). Thus the average burden per respondent is 5 minutes (0.08 hours) for these 600 total participants in the non-response survey.

These burden estimates reflect a one-time expenditure in a single year.


6(b) Estimating Respondent Costs


According to the Bureau of Labor Statistics, the average hourly wage for private sector workers in the United States is $20.42 (2009$) (U.S. Department of Labor 2009). Assuming an average per-respondent burden of 0.5 hours for individuals mailed the survey and an average hourly wage of $20.42, the average cost per respondent is $10.21. Of the 7,628 individuals receiving the mail survey, 2,288 are expected to return their completed survey. The total cost for all individuals that return surveys would be $23,360.

Assuming an average per-respondent burden of 0.08 hours (5 minutes) for each of the 600 total participants in the non-response study and an average hourly wage of $20.42, the average cost per non-response study participant is $1.70. Therefore, the total cost to participants in the non-response study phase would be $1,021.
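As a cross-check on the burden and cost figures in Sections 6(a) and 6(b), the short sketch below reproduces the arithmetic from the hours and wage rate reported above; values round to the totals cited in the text.

```python
WAGE = 20.42  # average hourly private-sector wage, 2009$ (U.S. Department of Labor 2009)

# Mail survey: 2,288 completes at 0.5 hours each
mail_hours = 2288 * 0.5            # 1,144 hours
mail_cost = mail_hours * WAGE      # ~$23,360

# Non-response study: 600 participants at 5 minutes (0.08 hours) each
nr_hours = 600 * (5 / 60)          # 50 hours
nr_cost = nr_hours * WAGE          # ~$1,021

print(mail_hours, round(mail_cost))       # 1144.0 23360
print(round(nr_hours), round(nr_cost))    # 50 1021
```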

EPA does not anticipate any capital or operation and maintenance costs for respondents.


6(c) Estimating Agency Burden and Costs

OMB approved implementation of the Northeast regional version of the stated preference survey as a pilot study conducted in advance of the other survey versions. EPA has completed fielding both the Northeast mail survey and the non-response follow-up study. A preliminary model has been estimated for the Northeast region, and weighting adjustments are being assessed based on the results of the non-response study. The remaining survey versions (Inland, Southeast, Pacific, and National) are still being fielded. Agency and contractor burden has been updated within this ICR based on the response rates observed during the Northeast pilot.

For the main mail survey in the Northeast region, EPA received a total of 399 completed surveys, for a 30% response rate equal to the rate assumed during development of the sampling frame. EPA administered the non-response survey via Priority Mail and telephone. The initial target sample sizes were 73 and 36 for the Priority Mail and telephone subsamples, respectively, for 109 total non-response contacts. For the Priority Mail subsample, EPA randomly selected 146 non-responding households based on an anticipated 50% response rate (73/0.5). The anticipated response rate was based on prior studies that administered surveys via Priority Mail. EPA actually received 48 completes from the Priority Mail sample, a 33% response rate (48/146). Because the Priority Mail response was lower than expected, the target number of telephone completes was increased to obtain the desired number of responses. EPA randomly selected 331 households for the telephone survey from the subset of households with matched telephone numbers that did not complete the main mail survey or Priority Mail questionnaire. Fifty-one of these households had previously been sent, but did not return, the Priority Mail questionnaire. The other 280 households (331 - 51) were sent a preview letter, including a $2 incentive, one week before the first telephone attempt. The telephone survey was divided into replicates to potentially reduce cost if the required number of completes was achieved early. EPA made up to 12 attempts to achieve telephone contact with the selected households. EPA stopped telephone calls after reaching 63 completes among the 331 selected households, for a response rate of 19%. Revised estimates of agency and contractor burden for the non-response study were calculated by adjusting previous estimates upward based on the ratio of assumed response rates to response rates observed for the Northeast region (50%/33% for Priority Mail and 80%/19% for telephone). Respondent burden was unchanged because the target completes for the Priority Mail and telephone non-response samples were unchanged and estimated respondent burden is limited to time spent completing the survey.
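The burden adjustment described above is simply the ratio of the assumed to the observed response rate; a minimal illustration follows (the hour input in the helper is a placeholder, not a figure from this ICR).

```python
# Ratios of assumed to observed Northeast pilot response rates, used to scale
# prior agency/contractor burden estimates for the non-response study upward.
priority_mail_factor = 0.50 / 0.33   # ~1.52
telephone_factor = 0.80 / 0.19       # ~4.21

def adjusted_hours(previous_estimate: float, factor: float) -> float:
    """Scale a prior burden estimate by the response-rate ratio (illustrative;
    'previous_estimate' is a placeholder, not a figure from this ICR)."""
    return previous_estimate * factor

print(round(priority_mail_factor, 2), round(telephone_factor, 2))  # 1.52 4.21
```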

This project will be undertaken by Abt Associates Inc. with funding of $355,969 from EPA contract EP-C-07-023, which provides funds for the purpose of analyzing the economic benefits of the proposed rule for existing facilities subject to the section 316(b) regulation. Abt Associates Inc. staff is expected to spend 5,952 hours pre-testing the survey questionnaire and sampling methodology, conducting the mail survey, conducting the non-response survey, and tabulating and analyzing the survey results. The cost of this contractor time is $255,438. In addition to the effort expended by EPA’s contractors, EPA staff is expected to spend 320 hours managing and reviewing this project and contributing to the analysis at a cost of $31,000. Agency and contractor burden is 6,272 hours, with a total cost of $286,438 excluding the costs of survey printing and mailing. Mailing and printing of the survey is expected to take 133 hours and cost $100,531. Thus, the total Agency and contractor burden would be 6,404 hours and would cost $386,969.


6(d) Respondent Universe and Total Burden Costs


EPA expects the total cost for survey respondents to be $24,381 (2009$), based on a total burden estimate of 1,194 hours and an hourly wage of $20.42.


6(e) Bottom Line Burden Hours and Costs


The following table presents EPA’s estimate of the total burden and costs of this information collection:


Table A3: Total Estimated Bottom Line Burden and Cost Summary

Affected Individuals | Northeast Region: Burden (hours) / Cost (2009$) | Other Survey Regions: Burden (hours) / Cost (2009$) | Total – All Regions: Burden (hours) / Cost (2009$)
Mail Survey Respondents | 209 / $4,257 | 936 / $19,103 | 1,144 / $23,360
Non-response Survey Participants | 9 / $186 | 41 / $835 | 50 / $1,021
Total for Survey Respondents | 218 / $4,444 | 976 / $19,937 | 1,194 / $24,381
EPA Staff | 58 / $5,650 | 262 / $25,350 | 320 / $31,000
Survey Printing and Mailing | 24 / $18,322 | 109 / $82,209 | 133 / $100,531
EPA's Contractors for the Mail Survey | 211 / $14,034 | 944 / $62,966 | 1,155 / $77,000
Priority Mail Non-Response Subsample | 92 / $9,512 | 410 / $42,610 | 502 / $52,122
Telephone Non-Response Subsample | 773 / $22,737 | 3,522 / $103,294 | 4,295 / $126,316
Total Burden and Cost | 1,375 / $74,983 | 6,223 / $336,367 | 7,598 / $411,350


6(f) Reasons for Change in Burden


The survey is a one-time data collection activity.


6(g) Burden Statement


EPA estimates that the public reporting and record keeping burden associated with the mail survey will average 0.5 hours per respondent (i.e., a total of 1,144 hours of burden divided among 2,288 survey respondents). Households included in the non-response study are expected to average 0.08 hours per screening interview participant (i.e., a total of 50 hours of burden divided among 600 non-response study participants). This results in a total burden estimate of 1,194 hours including both the mail survey and non-response study. Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, or disclose or provide information to or for a Federal agency. This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information, processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB control numbers for EPA's regulations are listed in 40 CFR part 9 and 48 CFR chapter 15.

To comment on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden, including the use of automated collection techniques, EPA has established a public docket for this ICR under Docket ID No. EPA-HQ-OW-2010-0595, which is available for online viewing at www.regulations.gov, or in person viewing at the Office of Water Docket in the EPA Docket Center (EPA/DC), EPA West, Room 3334, 1301 Constitution Ave., NW, Washington, DC. The EPA/DC Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal holidays. The telephone number for the Reading Room is 202-566-1744, and the telephone number for the Office of Water Docket is 202-566-1752.

Use www.regulations.gov to obtain a copy of the draft collection of information, submit or view public comments, access the index listing of the contents of the docket, and to access those documents in the public docket that are available electronically. Once in the system, select “search,” then key in the docket ID number, EPA-HQ-OW-2010-0595.

PART B OF THE SUPPORTING STATEMENT


1. Survey Objectives, Key Variables, and Other Preliminaries

1(a) Survey Objectives


The overall goal of this survey is to explore how public values (including non-use values) for fish and aquatic organisms are affected by I&E mortality at cooling water intake structures (CWIS) located at existing 316(b) facilities, as reflected in individuals’ willingness to pay for programs that would prevent such losses. EPA has designed the survey to provide data to support the following specific objectives:


  • To estimate the total values, including non-use values, that individuals place on preventing losses of fish and other aquatic organisms caused by CWIS at existing 316(b) facilities.

  • To understand how much individuals value preventing fish losses, increasing fish populations, improvements in aquatic ecosystems, and increasing commercial and recreational catch rates.

  • To understand how such values depend on the current baseline level of fish populations and fish losses, the scope of the change in those measures, and the certainty level of the predictions.

  • To understand how such values vary with respect to individuals’ economic and demographic characteristics.


Understanding total public values for fish resources lost to I&E mortality is necessary to determine the full range of benefits associated with reductions in impingement and entrainment losses at existing 316(b) facilities. Because non-use values may be substantial, failure to recognize such values may lead to improper inferences regarding policy benefits (Freeman 2003).


1(b) Key Variables


The key questions in the survey ask respondents whether or not they would vote for policies that would increase their cost of living, in exchange for specified changes in: (a) I&E mortality losses of fish, (b) commercial fish sustainability, (c) long-term fish populations, and (d) condition of aquatic ecosystems3. More specifically, the choice experiment framework allows respondents to view pairs of multi-attribute policies associated with the reduction of I&E mortality losses. Respondents are asked to choose the program that they would prefer, or to choose to reject both policies. This follows well-established choice experiment methodology and format (Adamowicz et al. 1998; Louviere et al. 2000; Bennett and Blamey 2001; Bateman et al. 2002). Important variables in the analysis of the choice questions are how the respondent votes, the amount of the cost of living increase, the number of fish losses that are prevented, the sustainability of commercial fishing, the change in fish populations, and the condition of aquatic ecosystems. Other important variables include whether or not the respondent is a user of the affected aquatic resources, household income, and other respondent demographics.


1(c) Statistical Approach


EPA believes that a statistical survey approach is appropriate. A census approach is impractical because contacting all households in the U.S. would require an enormous expense. On the other hand, an anecdotal approach is not sufficiently rigorous to provide a useful estimate of the total value of fish loss reductions for the 316(b) case. Thus, a statistical survey is the most reasonable approach to satisfy EPA’s analytic needs for the 316(b) regulation benefit analysis.

EPA has retained Abt Associates Inc. (55 Wheeler Street, Cambridge, MA 02138) as a contractor to assist in questionnaire design, sampling design, and analysis of the survey results.


1(d) Feasibility


The survey instrument was repeatedly pre-tested during a series of seven focus groups (conducted under a different ICR with OMB control # 2090-0028), in addition to the twelve focus groups conducted for the Phase III survey (EPA-HQ-OW-2004-0020), and it will be subject to peer review by reviewers in academia and government, so EPA does not anticipate that respondents will have difficulty interpreting or responding to any of the survey questions. Additionally, since the survey will be administered as a mail survey, it will be easily accessible to respondents. Thus, EPA believes that respondents will not face any obstacles in completing the survey, and that the survey will produce useful results. EPA has dedicated sufficient funding (under EPA contract No. EP-C-07-023) to design and implement the survey. Given the timetable outlined in Section A.5(d) of this document, the survey results will be available for timely use in the final benefits analysis for the 316(b) existing facilities rule.


2. Survey Design

2(a) Target Population and Coverage


The target population for this survey includes individuals from continental U.S. households who are 18 years of age or older. The sample will be chosen to reflect the demographic characteristics of the general U.S. population.


2(b) Sampling Design

(I) Sampling Frames


The sampling frame for this survey is the panel of individuals selected from the U.S. Postal Service Delivery Sequence File (DSF) to receive a mail survey. The overall sampling frame from which these individuals would be selected is the set of all individuals in continental U.S. households who are 18 years of age or older and who have a listed address. The DSF includes city-style addresses and P.O. boxes, and covers single-unit, multi-unit, and other types of housing structures, with known businesses excluded. In total, the DSF covers 97% of residences in the U.S.

For discussion of techniques that EPA will use to minimize non-response and other non-sampling errors in the survey sample, refer to Section 2(b)(II), below.


(II) Sample Sizes


The intended sample size for the survey is 2,288 households, counting only households that return completed mail surveys. This sample size was chosen to provide statistically robust regression results while minimizing the cost and burden of the survey. Given this sample size, the level of precision (see Section 2(c)) achieved by the analysis will be more than adequate to meet the analytic needs of the benefits analysis for the 316(b) regulation. For further discussion of the level of precision required by this analysis, see Section 2(c)(I) below.


(III) Stratification Variables


The survey sample will be selected using a stratified selection process. For the selection of households, the population of households in the contiguous 48 states and the District of Columbia will be stratified by the geographic boundaries of four study regions: Northeast, Southeast, Inland, and Pacific. As described previously, the Northeast region includes the North Atlantic and Mid Atlantic 316(b) benefits regions, the Southeast region includes the South Atlantic and Gulf of Mexico 316(b) benefits regions, the Pacific region includes states on the Pacific coast, and the Inland region includes all non-coastal states. The sample is allocated to each region in proportion to the total number of households in that region, with at least 288 completed surveys in each region. This is the number of completed surveys required to estimate the main effects and interactions under an experimental design model as described in Section 4(a) of Part A. To accommodate this requirement, the sample sizes in other regions will be slightly reduced. A sample of 288 households completing the national survey version would be distributed among the study regions based on the percentage of regional survey sample (as shown in Table A1) to ensure that respondents to the national survey version are distributed across the continental U.S.
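The proportional allocation described above can be illustrated with the household counts later reported in Table B1. This is a minimal sketch of straight proportional shares; the exact rounding conventions and the application of the 288-survey floor are not spelled out in this ICR.

```python
# Household counts by region from Table B1 (2006-2008 ACS).
regions = {"Northeast": 23_281_296, "Southeast": 31_378_122,
           "Inland": 40_852_983, "Pacific": 16_158_206}
total_completes = 2000  # target completed surveys across the regional versions
population = sum(regions.values())

allocation = {r: round(total_completes * n / population) for r, n in regions.items()}
print(allocation)
# {'Northeast': 417, 'Southeast': 562, 'Inland': 732, 'Pacific': 289}
# Table B1 reports 288 for the Pacific region; the at-least-288 floor and the
# rounding conventions used in the final design account for the one-unit difference.
```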


(IV) Sampling Method


Using the stratification design discussed above, respondents will be randomly selected from the U.S. Postal Service DSF database. If it is assumed that 30% of the selected households will actually return a completed mail survey (completion rate), then 7,628 questionnaires will need to be mailed to households.4 First, a sample of 7,628 addresses will be randomly selected from the DSF database. Then, a copy of the mail survey will be mailed to the selected addresses. To obtain population-based estimates of various parameters, each responding household will be assigned a sampling weight. This weight combines a base sampling weight, equal to the inverse of the household's probability of selection, with an adjustment for non-response. The weights will be used to produce estimates that are generalizable to the population from which the sample was selected (e.g., the percent of the population participating in water-based recreation such as fishing and shellfishing). Proportional allocation of the sample to regions ensures an equal probability sample. To estimate total WTP for the quantified environmental benefits of the 316(b) existing facilities rulemaking, data will be analyzed statistically using a standard random utility model framework.
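The weight construction described above (a base weight equal to the inverse selection probability, multiplied by a non-response adjustment within weighting classes) might be implemented along the following lines; the column names are hypothetical placeholders rather than fields of the actual survey database.

```python
import pandas as pd

def add_survey_weights(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative weight construction: base weight = 1 / selection probability,
    multiplied by a non-response adjustment equal to the inverse of the
    (base-weighted) response rate within each weighting class. The column names
    ('selection_prob', 'weight_class', 'responded') are hypothetical."""
    out = df.copy()
    out["base_weight"] = 1.0 / out["selection_prob"]

    eligible = out.groupby("weight_class")["base_weight"].sum()
    responding = out[out["responded"]].groupby("weight_class")["base_weight"].sum()
    adjustment = eligible / responding  # inverse of the weighted response rate

    out["final_weight"] = out["base_weight"] * out["weight_class"].map(adjustment)
    return out[out["responded"]]  # final weights are carried by respondents only
```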


(V) Multi-Stage Sampling


Multi-stage sampling will not be necessary for this survey.


2(c) Precision Requirements

(I) Precision Targets


Table B1, below, shows the target sample sizes for both the U.S. (excluding Alaska and Hawaii) and each of the four EPA study regions. At the regional level, the combined sample of 2,000 households (completed surveys), allocated as shown in Table B1, will provide estimates of population percentages with margins of error ranging from 3.6 to 5.8 percentage points at the 95% confidence level, depending on the regional sample size. A sample of 288 households for the national survey version (completed surveys) will provide estimates of population percentages with a margin of error no greater than 5.8 percentage points at the 95% confidence level.
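These margins of error follow from the standard formula for a proportion at its most conservative value (p = 0.5), assuming simple random sampling within each region; a quick check:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error (in percentage points) for an estimated proportion,
    assuming simple random sampling; p = 0.5 gives the most conservative value."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

for n in (288, 417, 562, 732):
    print(n, round(margin_of_error(n), 1))
# 288 -> 5.8, 417 -> 4.8, 562 -> 4.1, 732 -> 3.6 percentage points
```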


Table B1: Number of Households and Household Sample for Each EPA Study Region

Region | Household Population | Household Sample
Northeast | 23,281,296 | 417
Southeast | 31,378,122 | 562
Inland | 40,852,983 | 732
Pacific | 16,158,206 | 288
Total for Regional Survey Versions | 111,670,607 | 2,000
National Survey Version | 111,670,607 | 288

Source: The number of households in each region was obtained based on the estimated population size and average household size from the 2006-2008 American Community Survey (ACS).


(II) Non-Sampling Errors


One issue that may be encountered in stated preference surveys is the problem of protest responses. Protest responses are responses from individuals who reject the survey format or question design, even though they may value the resources being considered (Mitchell and Carson 1989). For example, some respondents may feel that any amount of I&E is unacceptable, and choose not to respond to the survey. To deal with this issue, EPA has included several questions, including an open-ended comments section, to help identify protest responses. The use of such methods to identify protest responses is well-established in the literature (Bateman et al. 2002). Moreover, many researchers (e.g., Bateman et al. 2002) suggest that a choice experiment format, such as that proposed here, may ameliorate such responses (over the earlier contingent valuation format).

A different type of non-sampling error is non-response bias. Non-response rates in this survey are affected by non-response among households sent the mail survey. EPA has designed the survey instrument to maximize the response rate. EPA will also follow Dillman’s mail survey approach (Dillman et al. 2008) to minimize the potential for non-response bias in the current survey:

  1. Preview letter: respondents will receive a preview letter that notifies the household that it has been selected and briefly describes the survey;

  2. First survey mailing: the survey booklet will be sent to selected households 1 to 2 weeks after the preview letter;

  3. Postcard reminder: a postcard reminder will be sent 1 week after the 1st survey mailing;

  4. Second survey mailing: the survey booklet will be sent, 3 weeks after the first survey mailing, to those households that did not respond to the first mailing;

  5. Second reminder: a follow-up letter (Dillman et al. 2008) will be sent 1 week after the second survey mailing;

  6. Response rates will be tracked on a daily basis so that, if any unexpected declines are encountered, corrective action can be undertaken immediately;

  7. EPA will undertake non-response bias analysis as detailed in the following section.

If necessary, EPA will use appropriate weighting or other statistical adjustments to correct for bias due to non-response.


Non-response Interviews

To determine whether there is any evidence of significant non-response bias in the completed sample, EPA will conduct a non-response follow-up study to identify potential differences in WTP estimates associated with respondents to the mail survey and those that did not return the questionnaire.

EPA has used a set of key attitudinal and socio-demographic variables that are thought to be associated with WTP for reducing fish mortality from cooling water intake structures to develop a short questionnaire that will take respondents 5 minutes to complete. The short questionnaire will be implemented using a dual frame of telephone and priority mailing.

  • To select the priority mailing subsample, the entire sample of mail addresses will be matched against directory-listed landline telephone numbers. After the matching, the nonresponding mail addresses will be divided into two strata. The first stratum will consist of those nonresponding addresses with matched telephone numbers. The second stratum will consist of nonresponding mail addresses that do not have matched telephone numbers. The total subsample that we plan to select will be allocated to the two strata in proportion to the number of nonrespondents in each group. Households selected in each stratum will be sent a questionnaire by priority mailing. The mailing will include $2 in cash as an unconditional incentive for completion of the short questionnaire to encourage a high response rate.

  • The telephone subsample will be selected from the first stratum (addresses with matched telephone numbers). This subsample will include households that did not respond to the priority mailing and households that were not sent the priority mailing; it will be contacted by telephone. Once telephone contact is achieved with a household in this subsample, one adult is selected in each household as the designated respondent. If there is more than one eligible adult in the household, the respondent is selected at random using the most recent/next birthday method. Selected households will be sent a letter prior to calling that will include $2 in cash as an unconditional incentive for participation in the telephone interview, to promote a high response rate.

A second subsample from stratum 2 (addresses without matched telephone numbers), consisting of those who did not respond to the priority mailing and those who did not receive the priority mailing, will again be contacted by priority mailing. This will ensure adequate representation of those whose addresses do not match landline telephone numbers. Because the priority mail subsample covers households both with and without landlines, while the telephone subsample covers only those with landlines, a total subsample of 600 households is recommended, with 400 from priority mailing and 200 from the telephone subsample. The subsample of 600 households from the non-respondents permits EPA to reject the hypothesis of no difference in population percentages between respondents and non-respondents with 80 percent power when there is a difference of 12 percentage points, according to a two-sided statistical test. Since the estimates for the non-respondents are based on different sampling weights, EPA may only be able to detect differences of 13 or 14 percentage points. Table B2 illustrates the distribution of the priority mail and telephone subsamples across survey regions.
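The power statement above can be checked against the standard two-proportion formula for the minimum detectable difference. The sketch below treats the design effect from unequal weighting as an assumed input, since its exact value is not reported in this ICR; with no design effect the formula gives a smaller detectable difference, and a design effect of roughly 3.5 reproduces the 12-point figure.

```python
import math

def min_detectable_difference(n1: int, n2: int, p: float = 0.5,
                              deff: float = 1.0) -> float:
    """Minimum detectable difference in population percentages (two-sided test,
    alpha = 0.05, 80% power) between two independent samples, with an optional
    design effect for unequal weighting. The deff needed to reproduce the
    12- to 14-point figures cited above is an assumption, not an ICR value."""
    z_alpha = 1.960   # two-sided 5% critical value
    z_beta = 0.842    # 80% power
    se = math.sqrt(deff * p * (1 - p) * (1 / n1 + 1 / n2))
    return 100 * (z_alpha + z_beta) * se

# 2,288 mail-survey respondents vs. 600 non-response follow-up completes
print(round(min_detectable_difference(2288, 600), 1))            # ~6.4 points, unweighted
print(round(min_detectable_difference(2288, 600, deff=3.5), 1))  # ~12 points with deff ~= 3.5
```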

Table B2: Number of Non-responding Households in the Priority Mail and Telephone Subsamples

Region | Number of Non-Respondents | Number in Priority Mail Subsample | Number in Telephone Subsample | Number in Total Subsample (completes)
Northeast | 973 | 73 | 36 | 109
Southeast | 1,312 | 98 | 49 | 147
Inland | 1,708 | 128 | 64 | 192
Pacific | 675 | 51 | 25 | 76
Total for Regional Survey Versions | 4,668 | 350 | 175 | 524
National Survey Version | 672 | 50 | 25 | 76
Total – All Survey Versions | 5,340 | 400 | 200 | 600

Source: The number of households in each region was obtained based on the estimated population size and average household size from the 2006-2008 American Community Survey (ACS).



EPA will use the data from the non-response questionnaire to compare mail survey respondents and non-respondents. The items of information collected with the short questionnaire will help determine the types of individuals who are less likely to respond to the survey and may help in forming weighting classes for adjusting respondent weights to account for non-response and minimize the resulting bias. The cover letter and questionnaire used for the priority mail subsample are included as Attachments 12 and 13, respectively. The cover letter and script for the telephone subsample are included as Attachments 11 and 14.


2(d) Questionnaire Design


The information requested by the survey is discussed in Section 4(b)(I) of Part A of the supporting statement. The full text of the draft questionnaire for the Northeast region is provided in Attachment 1 and the full text of the draft questionnaire for the national survey version is provided in Attachment 2.

The following bullets discuss EPA’s reasons for including the questions in the survey:

  • Relative Importance of Issues Associated with Industrial Cooling Water. EPA included this section to prepare respondents to answer the stated preference questions by motivating respondents to consider the relative importance of key issues associated with the use of cooling water by industrial facilities.

  • Concern for Policy Issues. EPA included this section to prepare respondents to answer the stated preference questions by motivating respondents to think about the relative importance of different policy issues.

  • Relative Importance of Effects. This section was included to promote understanding of the metrics included in the stated preference questions by asking respondents to consider the relative importance of these metrics prior to evaluating policy options and by encouraging them to re-read previous pages for reminders if necessary.

  • Voting for Regulations to Prevent Fish Losses in the Respondent’s Region (or Nationally). The questions in this section are the key part of the survey. Respondents’ choices when presented with specific fish-related resource changes within their region and household cost increases are the main data that allow estimation of willingness-to-pay. The questions are presented in a choice experiment (A, B, or neither) format because this is an elicitation format that has been successfully used by a number of previous valuation studies (Adamowicz et al. 1998; Bateman et al. 2002; Bennett and Blamey 2001; Louviere et al. 2000; Johnston et al. 2002a, 2005; Opaluch et al. 1993). Furthermore, many focus group participants indicated that they have some previous experience making choices within a framework in which they are asked to vote for one of a series of options, and are comfortable with this format.

  • Reasons for Voting “No Policy”. This question provides information that will be used by EPA to identify protest responses.

  • Respondent Certainty and Reasons for Voting. This section is designed to identify respondents who incorrectly interpreted the choice questions or the uncertainty of outcomes. Responses to these questions are important to successfully control for hypothetical bias.

  • Recreational Experience. This question elicits recreational experience data to test if certain respondent characteristics influence responses to the referendum questions. This question will also allow EPA to identify resource non-users, for purposes of estimating non-user WTP (to gauge the relative importance of non-use values to overall benefits).

  • Demographics. Responses to these questions will be used to estimate the influence of demographic variables on respondents’ voting choices, and ultimately, their WTP to prevent I&E mortality losses of fish. This information will allow EPA to use regression results to estimate WTP for populations in different regions affected by the 316(b) rule for existing facilities.

  • Comments. This section is primarily intended to help identify protest responses, i.e. responses from individuals who rejected the format of the survey or the way the questions were phrased.


3. Pretests and Pilot Tests


EPA conducted extensive pretests of the survey instrument during a set of seven focus groups (EPA ICR # 2090-0028), in addition to the twelve focus groups conducted for the Phase III survey. These focus groups included individual cognitive interviews with survey respondents (Kaplowicz et al. 2004), and think-aloud or verbal protocol analyses (Schkade and Payne 1994). Individuals in these focus groups completed draft survey questionnaires and provided comments and feedback about the survey format and content, their interpretations of the questions, and other issues relevant to stated preference estimation. Particular emphasis in these survey pretests was on testing for the presence of potential biases associated with poorly-designed stated preference surveys, including hypothetical bias, strategic bias, symbolic (warm glow) bias, framing effects, embedding biases, methodological misspecification, and protest responses (Mitchell and Carson 1989). Based on focus group and cognitive interview responses, EPA made various improvements to the survey questionnaire including changes to ameliorate and minimize these biases in the final survey instrument.

EPA intends to implement this survey in two stages. First, EPA will implement the Northeast version of this survey. EPA will use the Northeast version of the survey as a pilot study to validate the survey responses, including the following:

  • Compare the actual and expected response rates. Based on typical mail survey response rates for surveys of this type, the expected response rate is between 20% and 40% of deliverable surveys.

  • Assess whether demographic characteristics of the respondents are significantly different from the average demographic characteristics in the Northeast region.

  • Check what proportion of respondents choose the status quo. If no one is choosing the status quo, it often indicates that the cost levels are too low. Pure random selection would result in 33% of survey respondents choosing the status quo. If fewer than 15 to 20% of responses choose the status quo in the pilot study, EPA would consider increasing the cost levels.

  • Make sure there are no unusual patterns, such as the vast majority of respondents always choosing Option A (e.g., if two-thirds of respondents (66%) chose Option A, this might indicate a systematic bias).

  • Look at follow-up Questions 8 and 9 to make sure the responses suggest that appropriate tradeoffs are being made and that respondents feel confident about their answers. If either the median or the mean answer to these questions is less than 3.0 (neutral), this would indicate a problem.

  • Examine response rates for individual survey questions and evaluate whether adjustments to survey questions are required to promote a higher response rate.

These data can be analyzed easily using means and standard deviations without introducing significant delays in the survey implementation schedule. If required, EPA will make the appropriate adjustments to the sampling frame or attribute levels (e.g., increase or reduce the number of surveys mailed to households, or increase costs to households in the choice questions).
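The pilot checks listed above reduce to simple tabulations of the Northeast data file. The sketch below illustrates one way to compute them; the column names and codings are hypothetical placeholders rather than the actual variable names in the survey database.

```python
import pandas as pd

def pilot_diagnostics(df: pd.DataFrame) -> dict:
    """Summary checks for the Northeast pilot, following the list above.
    Column names ('choice' coded 'A'/'B'/'Neither', 'q8_confidence' and
    'q9_influence' on a 1-5 scale) are hypothetical placeholders."""
    shares = df["choice"].value_counts(normalize=True)
    return {
        "status_quo_share": shares.get("Neither", 0.0),  # flag if well below ~15-20%
        "option_a_share": shares.get("A", 0.0),          # flag if near or above ~2/3
        "q8_mean": df["q8_confidence"].mean(),           # flag if mean or median < 3.0 (neutral)
        "q8_median": df["q8_confidence"].median(),
        "q9_mean": df["q9_influence"].mean(),
        "q9_median": df["q9_influence"].median(),
        "item_response_rates": df.notna().mean().to_dict(),  # share of non-missing answers per question
    }
```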


4. Collection Methods and Follow-up

4(a) Collection Methods


The survey will be administered as a mail survey. Respondents will be asked to mail the completed survey back to EPA.


4(b) Survey Response and Follow-up


The target response rate for the mail survey is 30 percent. That is, 30 percent of households which are sent the mail survey are expected to return a completed survey. To improve the response rate, all of these households will receive a reminder postcard approximately one week after the initial questionnaire mailing. Then, approximately three weeks after the reminder postcard, all those who have not responded will receive a second copy of the questionnaire with a revised cover letter. The following week, a letter reminding them to complete the survey will be sent.

As noted in Section 2(b), the survey sample will be selected using a stratified selection process. For the selection of households, the population of households in the contiguous 48 states and the District of Columbia will be stratified by the geographic boundaries of four EPA study regions. In addition, EPA will administer a national version of the survey that does not require stratification. We will keep track of the response rates for each of the regional surveys and the national version of the survey to ensure that the rates are reasonable. We will also examine the frame characteristics of non-respondents to determine whether there are any substantial biases in the estimates because of an imbalance in the distribution of certain important subgroups in the sample.

OMB approved implementation of the Northeast regional version of the stated preference survey as a pilot study conducted in advance of the other survey versions. EPA has completed fielding both the Northeast mail survey and the non-response follow-up study. For the main mail survey in the Northeast region, EPA received a total of 399 completed surveys, for a 30% response rate equal to the rate assumed during development of the sampling frame.

The initial target sample sizes for the Northeast non-response survey were 73 and 36 for the Priority Mail and telephone subsamples, respectively, for 109 total non-response contacts. For the Priority Mail subsample, EPA randomly selected 146 non-responding households based on an anticipated 50% response rate (73/0.5). The anticipated response rate was based on prior studies that administered surveys via Priority Mail. As described in Section 6(c) of Part A, EPA actually received 48 completes from the Priority Mail sample, a 33% response rate (48/146). Because the Priority Mail response was lower than expected, the target number of telephone completes was increased to obtain the desired number of responses. EPA randomly selected 331 households for the telephone survey from the subset of households with matched telephone numbers that did not complete the main mail survey or Priority Mail questionnaire. EPA made up to 12 attempts to achieve telephone contact with the selected households. EPA stopped telephone calls after reaching 63 completes among the 331 selected households, for a response rate of 19%. A preliminary model has been estimated for the Northeast region, and weighting adjustments are being assessed based on the results of the non-response study. The remaining survey versions (Inland, Southeast, Pacific, and National) are still being fielded. EPA will implement the non-response surveys for the remaining regions in the same manner used for the Northeast survey. EPA is targeting the numbers of completed responses presented in Table B2 but has revised estimates of agency and contractor burden for the non-response study based on the response rates for the Priority Mail and telephone Northeast non-response surveys, 33 and 19 percent, respectively.


5. Analyzing and Reporting Survey Results

5(a) Data Preparation


Since the survey will be administered as a mail survey, survey responses will be scanned and entered into an electronic database after they are returned. After all responses have been entered, the database contents will be converted into a format suitable for use with a statistical analysis software package. The mail survey, database management, and data set conversion will be conducted by Abt Associates Inc.

All survey responses will be vetted for completeness. Additionally, respondents’ answers to the choice experiment questions will be tested to ensure that they are internally consistent with respect to scope and other expectations of neoclassical preference theory, such as transitivity. Responses that satisfy transitivity exhibit consistent orderings when separate choices among policy options are compared. For example, if values for Policy 1 are greater than those for Policy 2, and values for Policy 2 are greater than those for Policy 3, then values for Policy 1 should also be greater than values for Policy 3.
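One simple way to screen for the transitivity violations described above is to check each respondent's inferred pairwise orderings for cycles among triples of policies; a minimal sketch follows, with a hypothetical encoding of the inferred preferences.

```python
from itertools import permutations

def has_intransitive_triple(preferred):
    """Return True if any three policies form an intransitive cycle
    (i preferred to j, j preferred to k, yet k preferred to i). 'preferred'
    holds pairs (i, j) meaning policy i was revealed preferred to policy j
    across a respondent's choice questions (a hypothetical encoding)."""
    policies = {p for pair in preferred for p in pair}
    return any((a, b) in preferred and (b, c) in preferred and (c, a) in preferred
               for a, b, c in permutations(policies, 3))

# Policy 1 > Policy 2 and Policy 2 > Policy 3 is consistent only if Policy 3
# is not also revealed preferred to Policy 1.
print(has_intransitive_triple({("P1", "P2"), ("P2", "P3"), ("P1", "P3")}))  # False
print(has_intransitive_triple({("P1", "P2"), ("P2", "P3"), ("P3", "P1")}))  # True
```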


5(b) Analysis


Once the survey data has been converted into a data file, it will be analyzed using statistical analysis techniques. The following section discusses the model that will be used to analyze the stated preference data from the survey.


Analysis of Stated Preference Data

The model for analysis of stated preference data is grounded in the standard random utility model of Hanemann (1984) and McConnell (1990). This model is applied extensively within stated preference research, and allows well-defined welfare measures (i.e., willingness to pay) to be derived from choice experiment models (Bennett and Blamey 2001; Louviere et al. 2000). Within the standard random utility model applied to choice experiments, hypothetical policy alternatives are described in terms of attributes that focus groups (Johnston et al. 1995; Adamowicz et al. 1998; Opaluch et al. 1993) reveal as relevant to respondents’ utility, or well-being. One of these attributes would include a mandatory monetary cost to the respondent’s household.

Applying this standard model to choices among policies to reduce I&E mortality losses, EPA defines a standard utility function Ui(.) that includes environmental attributes of an I&E reduction plan and the net cost of the plan to the respondent. Following standard random utility theory, utility is assumed known to the respondent, but stochastic from the perspective of the researcher, such that


(1) Ui(.) = U(Xi, D, Y-Fi) = v(Xi, D, Y-Fi) + εi


where:

Xi = a vector of variables describing attributes of I&E reduction plan i;

D = a vector characterizing demographic and other attributes of the respondent;

Y = disposable income of the respondent;

Fi = mandatory additional cost faced by the household under plan i;

v(.) = a function representing the empirically estimable component of utility;

εi = stochastic or unobservable component of utility, modeled as an econometric error.


Econometrically, a model of such a preference function is obtained by methods designed for limited dependent variables, because researchers only observe the respondent’s choice among alternative policy options, rather than observing values of Ui(.) directly (Maddala, 1983; Hanemann, 1984). Standard random utility models are based on the probability that a respondent’s utility from a policy Plan i, Ui(.), exceeds the utility from alternative Plans j, Uj(.), for all potential plans j ≠ i considered by the respondent. In this case, the respondent’s choice set of potential policies also includes maintaining the status quo. The random utility model presumes that the respondent assesses the utility that would result from each I&E reduction plan i (including the status quo), and chooses the plan that would offer the highest utility.

When faced with k distinct plans defined by their attributes, the respondent will choose plan i if the anticipated utility from plan i exceeds that of all other k-1 plans. Drawing from (1), the respondent will choose plan i if


(2) (v(Xi, D, Y-Fi) + εi) ≥ (v(Xk, D, Y-Fk) + εk) for all k ≠ i.


If the εi are assumed independently and identically drawn from a type I extreme value (Gumbel) distribution, the model may be estimated as a conditional logit model, as detailed by Maddala (1983), Greene (2003) and others. This model is most commonly used when the respondent considers more than two options in each choice set (e.g., Plan A, Plan B, Neither Plan), and results in an econometric (empirical) estimate of the systematic component of utility v(.), based on observed choices among different policy plans. Based on this estimate, one may calculate welfare measures (willingness to pay) following the well-known methods of Hanemann (1984), as described by Freeman (2003) and others. Following standard choice experiment methods (Adamowicz et al. 1998; Bennett and Blamey 2001), each respondent will consider questions including three potential choice options (i.e., Plan A, Plan B, Neither Plan)—choosing the option that provides the highest utility as noted above. Following clear guidance from the literature, a “neither plan” or status quo option is always included in the visible choice set, to ensure that WTP measures are well-defined (Louviere et al. 2000).
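Under the conditional logit model described above, v(.) is typically specified as linear in the plan attributes and the household cost, and marginal willingness to pay for an attribute is the negative of the ratio of that attribute's coefficient to the cost coefficient. The following sketch estimates such a model by maximum likelihood on simulated choice data; the attribute names, levels, and data layout are illustrative assumptions, not the actual survey coding or EPA's estimation code.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

rng = np.random.default_rng(0)

# Simulated choice data: n_sets choice questions, each with 3 alternatives
# (Plan A, Plan B, Neither Plan). Illustrative attribute columns:
# fish saved per year (millions), ecosystem score change, household cost ($/year).
n_sets, n_alt = 1000, 3
X = np.zeros((n_sets, n_alt, 3))
X[:, :2, 0] = rng.uniform(50, 500, (n_sets, 2))
X[:, :2, 1] = rng.uniform(0, 2, (n_sets, 2))
X[:, :2, 2] = rng.uniform(10, 120, (n_sets, 2))  # "Neither Plan" keeps the status quo (all zeros)

true_beta = np.array([0.004, 0.5, -0.02])        # cost enters with a negative coefficient
choice = (X @ true_beta + rng.gumbel(size=(n_sets, n_alt))).argmax(axis=1)

def neg_loglik(beta):
    """Conditional logit log-likelihood: v(chosen) minus log-sum-exp over the choice set."""
    v = X @ beta
    return -(v[np.arange(n_sets), choice] - logsumexp(v, axis=1)).sum()

beta_hat = minimize(neg_loglik, np.zeros(3), method="BFGS").x
wtp = -beta_hat[:2] / beta_hat[2]  # marginal WTP = -(attribute coefficient) / (cost coefficient)
print(np.round(beta_hat, 4), np.round(wtp, 2))
```

In practice, EPA anticipates estimating this model and the random-parameters extensions discussed below with standard econometric software rather than hand-coded likelihoods; the sketch is intended only to make the welfare calculation concrete.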

EPA also anticipates that respondents will consider more than one choice question within the same survey, to increase information obtained from each respondent. This is standard practice within choice experiment and dichotomous choice contingent valuation surveys (Poe et al. 1997; Layton 2000). While respondents will be instructed to consider each choice question as independent of other choice questions, it is nonetheless standard practice within the literature to allow for the potential of correlation among questions answered within a single survey by a single respondent. That is, responses provided by individual respondents may be correlated even though responses across different respondents are considered independent and identically distributed (Poe et al. 1997; Layton 2000; Train 1998).

There are a variety of approaches to such potential correlation. Following standard practice, EPA anticipates the estimation of a variety of models to assess their performance. Models to be assessed include random effects and random parameters (mixed) discrete choice models, now common in the stated preference literature (Greene 2003; McFadden and Train 2000; Poe et al. 1997; Layton 2000). Within such models, selected elements of the coefficient vector are assumed normally distributed across respondents, often with free correlation allowed among parameters (Greene 2002). If only the model intercept is assumed to include a random component, then a random effects model results. If both slope and intercept parameters may vary across respondents, then a random parameters model is estimated. EPA anticipates that such models will be estimated using standard maximum likelihood for mixed conditional logit techniques, as described by Train (1998), Greene (2002) and others. Mixed logit model performance of alternative specifications will be assessed by EPA using standard statistical measures of model fit and convergence, as detailed by Greene (2002, 2003) and Train (1998).


Advantages of Choice Experiments

Choice experiments following the random utility model outlined above are favored by many researchers over other variants of stated preference methodology (Adamowicz et al. 1998; Bennett and Blamey 2001), and may be viewed as a “natural generalization of a binary discrete choice CV [contingent valuation]” (Bateman et al. 2002, p. 271). Advantages of choice experiments include a capacity to address choices over a wide array of potential policies, grounded in well-developed random utility theory, and the similarity of the discrete choice context to familiar referendum or voting formats (Bennett and Blamey 2001). Compared to other types of stated preference valuation, choice experiments are better able to measure the marginal value of changes in the characteristics or attributes of environmental goods, and avoid response difficulties and biases (Bateman et al. 2002). For example, choice experiments may reduce the potential for ‘yea-saying’ and symbolic biases (Blamey et al. 1999; Mitchell and Carson 1989), as many pairs of multi-attribute policy choices (e.g., Plan A, Plan B, Neither) will offer no clearly superior choice for a respondent wishing to express solely symbolic environmental motivations. For similar reasons choice experiments may ameliorate protest responses (Bateman et al. 2002). An additional advantage of such methods is that they permit straightforward assessments of the impact of resource scope and scale on respondents’ choices. This will enable EPA to easily conduct scope tests and other assessments of the validity of survey responses (Bateman et al. 2002, p. 296-342). Finally, such methods are well-established in the stated preference literature (Bennett and Blamey 2001). Additional details of choice experiment methodology (also called choice modeling) are provided by Bennett and Blamey (2001), Adamowicz et al. (1998), Louviere et al. (2000) and many other sources in the literature.

An additional advantage of choice experiments in the present application is that they are commonly applied to assess WTP for ecological resource improvements of a type quite similar to those at issue in the 316(b) policy case. Examples of the application of choice experiments to estimate WTP associated with changes in aquatic life and habitat include Hoehn et al. (2004), Johnston et al. (2002b), and Opaluch et al. (1999), among others. EPA has drawn upon these and other examples of successful choice experiment design to provide a basis for survey design in the present case.

A final key advantage of choice experiments in the present application is the ability to estimate respondents’ WTP for a wide range of different potential outcomes of 316(b) policies, differentiated by their attributes. The proposed choice experiment survey versions will allow different respondents to choose among a wide variety of hypothetical policy options, some with larger and others with very small changes in the presented attributes (annual fish losses, long-term fish populations, recreational and commercial catch, ecosystem condition, and household cost). That is, because the survey is to be implemented as a choice experiment survey, levels of attributes in choice scenarios will vary across respondents (Louviere et al. 2000). The experimental design will also explicitly allow for variation in baseline population and harvest levels, following standard practice in the literature (Louviere et al. 2000; Bateman et al. 2002).

Aside from providing the capacity to estimate WTP for a wide range of policy outcomes, this approach also frees EPA from having to predetermine a single policy outcome for which WTP will be estimated. Given the potential biological uncertainty involved in the 316(b) policy case, the ability to estimate values for a wide range of potential outcomes is critical.

The ability to estimate WTP for a wide range of different policy outcomes is a fundamental property of the choice experiment method (Bateman et al. 2002; Louviere et al. 2000; Adamowicz et al. 1998). For the purpose of stated preference survey implementation, EPA will use four geographic regions: Northeast, Southeast, Inland, and Pacific. The Northeast regional survey is included in this ICR as Attachment 1. In addition, EPA will administer a national version of the survey that is included as Attachment 2. EPA emphasizes that the survey versions included in this ICR are for illustration only; they are but two of what will ultimately be a large number of different survey versions covering a wide range of potential policy outcomes as described in Attachment 5. The experimental design (see below) will allow for survey versions showing a range of different baseline and resource improvement levels, where these levels are chosen to (almost certainly) bound the “actual” levels. Given that there will almost certainly be some biological uncertainty regarding the specifics of the “actual” baselines and improvements, the resulting valuation estimates will allow flexibility in estimating WTP for a wide range of different circumstances. Additional details on the statistical (experimental) design of the choice experiment are provided in later sections of this ICR.



Comment on Survey Preparation and Pretesting

Following standard practice in the stated preference literature (Johnston et al. 1995; Desvousges and Smith 1988; Desvousges et al. 1984; Mitchell and Carson 1989), all survey elements and methods were subjected to extensive development and pretesting in focus groups to ameliorate the potential for survey biases (cf. Mitchell and Carson 1989), and to ensure that respondents have a clear understanding of the policies and goods under consideration, such that informed choices may be made that reflect respondents’ underlying preferences. Following the guidance of Arrow et al. (1993), Johnston et al. (1995), and Mitchell and Carson (1989), focus groups were used to ensure that respondents are aware of their budget constraints, the scope of the resource changes under consideration, and the availability of substitute environmental resources.

As noted above, survey pretests included individual cognitive interviews with survey respondents (Kaplowitz et al. 2004), and think-aloud or verbal protocol analyses (Schkade and Payne 1994). Individuals in these pretests completed draft survey questionnaires and provided comments and feedback about the survey format and content, their interpretations of the questions, and other issues relevant to stated preference estimation. Based on their responses, EPA made improvements to the survey questionnaire. Of particular emphasis in these survey pretests was testing for the presence of potential biases including hypothetical bias, strategic bias, symbolic (warm glow) bias, framing effects, embedding biases, methodological misspecification, and protest responses (Mitchell and Carson 1989). Based on focus group and cognitive interview responses, EPA made various improvements to the survey questionnaire including changes to ameliorate and minimize these biases in the final survey instrument. Results from focus groups and cognitive interviews provided evidence that respondents answer the stated preference survey in ways appropriate for stated preference WTP estimation, and that their responses generally do not reflect the biases noted above.

The number of focus groups used in survey design, seven (excluding the 12 focus groups conducted for the Phase III survey), exceeds the number of focus groups used in typical applications of stated preference valuation. Moreover, EPA incorporated cognitive interviews as detailed by Kaplowitz et al. (2004). We note that the current survey instrument is built upon an earlier version that was peer reviewed in January 2006 (Versar 2006) and incorporates recommendations received from that peer review panel. Given this extensive effort in survey design, which applies state-of-the-art methods from the literature, EPA believes that the survey design far exceeds standards that are typical in the published literature. The details of focus groups conducted for the previous Phase III survey are discussed by EPA in a prior ICR (#2155.01).


Econometric Specification

Based on prior focus groups, expert review, and attributes of the policies under consideration, EPA anticipates that four attributes will be incorporated in the vector of variables describing attributes of an I&E reduction plan (vector Xi), in addition to the attribute characterizing unavoidable household cost Fi.5 These attributes will characterize the annual reduction in I&E losses (x1), anticipated effects on fish populations (all fish) (x2), anticipated effects on commercial fish populations (x3), and anticipated effects on aquatic ecosystem condition (x4). These variables will allow respondents’ choices to reveal the potential impact of both annual fish losses and long-term population effects on utility. Based on results of focus groups and expert opinion, these will be presented as averages across identified aggregate species groups. The survey will also allow for changes in baseline population levels, to assess whether WTP depends on the “starting point” of fish populations.

Although the literature offers no firm guidance regarding the choice of specific functional forms for v(.) within choice experiment estimation, in practice, linear forms are often used (Johnston et al. 2003b), with some researchers applying more flexible (e.g., quadratic) forms (Cummings et al. 1994). Standard linear forms are anticipated as the simplest form to be estimated by EPA, from which more flexible functional forms (able to capture interactions among model variables) will be derived and compared. Anticipated extensions to the simple linear model include more fully flexible forms that allow for systematic variations in slope and intercept coefficients associated with demographic or other attributes of respondents. Such variations may be incorporated by augmenting the simple linear specification with interaction terms between variables in vector D and the variables Xi and Fi (cf. Johnston et al. 2003b).

One may also incorporate quadratic interactions between the policy attributes Xi and Fi (cf. Johnston et al. 2002b). Such quadratic extensions of the basic linear model allow for additional flexibility in modeling the relationship between policy attributes (including cost) and utility, as suggested by Hoehn (1991) and Cummings et al. (1994). EPA anticipates estimating both simple linear specifications and more fully flexible quadratic specifications following Hoehn (1991) and Cummings et al. (1994), to identify the models that provide the most satisfactory statistical fit to the data and correspondence to theory. EPA anticipates estimating all models within the mixed logit framework outlined above. Model fit will be assessed following standard practice in the literature (e.g., Greene 2003; Maddala 1983). Because the linear and quadratic functional forms discussed here are common practice in the literature, they are presented and discussed in many existing sources (e.g., Hoehn 1991; Cummings et al. 1994; Johnston et al. 1999; Johnston et al. 2003b).
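To make the estimation framework concrete, the following minimal sketch illustrates maximum likelihood estimation of a conditional (multinomial) logit with a linear-in-attributes utility on synthetic data. The sketch is illustrative only: the number of attributes, the data-generating coefficients, and the sample size are hypothetical and are not part of EPA's design, and the mixed logit models described above would replace the fixed coefficient vector with random coefficients whose distribution is integrated out by simulation.

```python
# Minimal sketch: conditional (multinomial) logit with linear utility v = X @ beta,
# estimated by maximum likelihood on synthetic data. All numeric values are hypothetical.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_resp, n_alt, n_attr = 500, 3, 5                    # respondents; Options A, B, Neither; attributes
true_beta = np.array([0.8, 0.6, 0.5, 0.7, -0.04])    # last coefficient is cost

# Synthetic attribute levels; the "Neither" option has all attributes at zero.
X = rng.uniform(0, 10, size=(n_resp, n_alt, n_attr))
X[:, 2, :] = 0.0

def choice_probs(beta, X):
    v = X @ beta                            # systematic utility of each alternative
    v = v - v.max(axis=1, keepdims=True)    # subtract row maximum for numerical stability
    ev = np.exp(v)
    return ev / ev.sum(axis=1, keepdims=True)

# Simulate one observed choice per respondent from the true model.
p_true = choice_probs(true_beta, X)
choices = np.array([rng.choice(n_alt, p=p) for p in p_true])

def neg_log_likelihood(beta):
    p = choice_probs(beta, X)
    return -np.log(p[np.arange(n_resp), choices]).sum()

fit = minimize(neg_log_likelihood, x0=np.zeros(n_attr), method="BFGS")
print("estimated coefficients:", np.round(fit.x, 3))
```

In a quadratic or interacted specification, the columns of X would simply be extended with the relevant cross-product terms before estimation.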

For example, for each choice occasion, the respondent may choose Option A, Option B, or Neither, where “neither” is characterized by 0 values for all attributes (except Baseline population levels). Assuming that the model is estimated using a standard approximation for the observable component of utility, an econometric specification of the desired model (within the overall multinomial logit model) might appear as:


v(·) = β0 + β1(Fish Saved) + β2(Change in Populations of All Fish) + β3(Change in Commercial Fish Populations) + β4(Change in Condition of Aquatic Ecosystem) + β5(Cost) + β6(Fish Saved)(Baseline) + β7(Change in Populations of All Fish)(Baseline) + β8(Change in Commercial Fish Populations)(Baseline) + β9(Change in Aquatic Ecosystem)(Baseline) + β10(Cost)(Baseline) + β11(Fish Saved)(Change in Populations of All Fish) + β12(Fish Saved)(Change in Commercial Fish Populations) + β13(Fish Saved)(Change in Aquatic Ecosystem) + β14(Change in Populations of All Fish)(Change in Commercial Fish Populations) + β15(Change in Populations of All Fish)(Change in Aquatic Ecosystem) + β16(Change in Commercial Fish Populations)(Change in Aquatic Ecosystem)



The terms β1 through β5 are main effects; the terms β6 through β16 are interactions. This sample specification, one of many to be estimated by EPA, allows one to estimate the relative main effects of the policy attributes (annual reduction in I&E losses, long-term effects on populations of all fish, long-term effects on commercial fish populations, and effects on aquatic ecosystem condition) on utility, as well as interactions between these main effects. This specification also allows EPA to assess the impact of baseline fish populations on the marginal value of changes in other model attributes. In sum, specifications such as this allow WTP to be estimated for a wide range of potential policy outcomes, and allow EPA to test for a wide range of main effects and interactions within the utility function of respondents. Such flexible utility specifications for stated preference estimation are recommended by numerous sources in the literature, including Johnston et al. (2002b), Hoehn (1991), and Cummings et al. (1994), and follow standard practice in choice modeling outlined by Louviere et al. (2000) and others.
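To illustrate how WTP would be recovered from this specification, note that the marginal WTP for a one-unit change in an attribute equals the negative of the ratio of that attribute's marginal utility to the marginal utility of cost; with the interactions above, both derivatives depend on the baseline and on the levels of the other attributes. The sketch below computes this ratio for the Fish Saved attribute. The coefficient values and attribute levels shown are hypothetical placeholders rather than estimates; in practice the coefficients would come from the estimated mixed logit models described above.

```python
# Marginal WTP for Fish Saved under the interacted specification:
#   MWTP = -(dv/dFishSaved) / (dv/dCost)
# where dv/dFishSaved = b1 + b6*Baseline + b11*AllFish + b12*CommFish + b13*Ecosystem
# and   dv/dCost      = b5 + b10*Baseline.
# All numeric values below are hypothetical placeholders used only for illustration.

def mwtp_fish_saved(b, baseline, all_fish, comm_fish, ecosystem):
    dv_dfish = (b["b1"] + b["b6"] * baseline + b["b11"] * all_fish
                + b["b12"] * comm_fish + b["b13"] * ecosystem)
    dv_dcost = b["b5"] + b["b10"] * baseline
    return -dv_dfish / dv_dcost

betas = {"b1": 0.60, "b5": -0.040, "b6": -0.010, "b10": 0.001,
         "b11": 0.005, "b12": 0.004, "b13": 0.006}
print(mwtp_fish_saved(betas, baseline=26, all_fish=5, comm_fish=3, ecosystem=4))
```

Because these derivatives vary with attribute levels, a single estimated model yields a schedule of WTP values across the range of policy outcomes spanned by the design, which is the flexibility emphasized above.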


Experimental Design

Experimental design for the choice experiment surveys will follow established practices. Fractional factorial design will be used to construct choice questions with an orthogonal array of attribute levels, with questions randomly divided among distinct survey versions (Louviere et al. 2000). Based on standard choice experiment experimental design procedures (Louviere et al. 2000), the number of questions and survey versions will be determined by, among other factors: a) the number of attributes in the final experimental design and complexity of questions, b) the extent to which estimation of interactions and higher-level effects is desired, and c) pretests revealing the number of choice experiment questions that respondents are willing/able to answer in a single survey session, and the number of attributes that may be varied within each question while maintaining respondents’ ability to make appropriate neoclassical tradeoffs.

Based on the models proposed above and recommendations in the literature, EPA anticipates an experimental design that allows for an ability to estimate main effects, quadratic effects, and two-way interactions between policy attributes (Louviere et al. 2000). Choice sets (Bennett and Blamey 2001), including variable level selection, will be designed by EPA based on the goal of illustrating realistic policy scenarios that “span the range over which we expect respondents to have preferences, and/or are practically achievable” (Bateman et al. 2002, p. 259), following guidance in the literature. This includes guidance with regard to the statistical implications of choice set design (Hanemann and Kanninen 1999) and the role of focus groups in developing appropriate choice sets (Bennett and Blamey 2001).

Based on these guiding principles, the following experimental design framework is proposed by EPA. The experimental design will be conducted by Abt Associates Inc. The experimental design will allow for both main effects and selected interactions to be efficiently estimated, based on a choice experiment framework. For a more detailed discussion of the experimental design, refer to Attachment 5.

Each treatment (survey question) includes two choice options (A and B), each characterized by four resource attributes and a cost variable that vary across the two options (Commercial Fish Populations, Fish Populations (all fish), Fish Saved per Year, Condition of Aquatic Ecosystems, and Increase in Cost of Living of Your Household). Hence, each treatment involves a total of ten design attributes (five for Option A and five for Option B). Based on focus groups and pretests, and guided by realistic ranges of attribute outcomes, EPA allows for three different potential levels for Commercial Fish Populations, Fish Populations (all fish), Fish Saved per Year, and Condition of Aquatic Ecosystems, and for six different levels of annual Household Cost for the regional or national choice questions. The levels for each attribute may be summarized as follows:

  • Commercial Fish PopulationsA, Commercial Fish PopulationsB (3 levels)

  • Fish Populations (all fish)A, Fish Populations (all fish)B (3 levels)

  • Fish Saved per YearA, Fish Saved per YearB (3 levels)

  • Condition of Aquatic EcosystemsA, Condition of Aquatic EcosystemsB (3 levels)

  • CostA, CostB (6 levels)


Beyond the levels specified above, each question will include a “no policy” option, characterized by baseline levels for each attribute including a household cost of $0.

Following standard practice, EPA constrained the design somewhat in response to findings from the seven focus groups and the prior literature. For example, the focus groups showed that respondents react negatively, and often protest, when offered choices in which one option dominates the other in all attributes. Because such choices provide negligible statistical information compared to choices involving non-dominated pairs, they are typically avoided in choice experiment statistical designs. For example, Hensher and Barnard (1990) recommend eliminating choice sets that include dominating or dominated profiles, because such profiles generally provide no useful information. Following this guidance, EPA constrained the design to eliminate dominated/dominating pairs. EPA also constrained the design to eliminate pairs in which one option offers both a greater reduction in annual fish losses and a smaller increase in fish populations than the other option. The elimination of such nonsensical (or non-credible) pairs is common practice, and is done to avoid protest bids and confusion among respondents (Bateman et al. 2002).
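As a minimal illustration of this screening step, the sketch below enumerates candidate Option A versus Option B profiles from the attribute levels listed above (coded here as ordinal level indices 0 through 2, and 0 through 5 for cost) and drops every pair in which one option weakly dominates the other, that is, offers levels at least as high on every resource attribute and a cost no higher, with at least one strict improvement. The level codes and names are hypothetical illustrations; the actual design will be a fractional factorial constructed by Abt Associates Inc., further restricted by the credibility constraints described above, rather than a full enumeration.

```python
# Minimal sketch: enumerate candidate A vs. B profiles and screen out pairs in
# which one option weakly dominates the other. Level codes are hypothetical.
from itertools import product

RESOURCE_ATTRS = ("commercial_fish", "all_fish", "fish_saved", "ecosystem")
LEVELS = {
    "commercial_fish": (0, 1, 2),
    "all_fish": (0, 1, 2),
    "fish_saved": (0, 1, 2),
    "ecosystem": (0, 1, 2),
    "cost": (0, 1, 2, 3, 4, 5),
}

def profiles():
    keys = tuple(LEVELS)
    for combo in product(*(LEVELS[k] for k in keys)):
        yield dict(zip(keys, combo))

def dominates(a, b):
    """True if profile a is at least as good as b on every resource attribute,
    costs no more, and is strictly better on at least one dimension."""
    weakly_better = (all(a[k] >= b[k] for k in RESOURCE_ATTRS)
                     and a["cost"] <= b["cost"])
    strictly_better = (any(a[k] > b[k] for k in RESOURCE_ATTRS)
                       or a["cost"] < b["cost"])
    return weakly_better and strictly_better

all_profiles = list(profiles())
candidate_pairs = [
    (a, b)
    for a in all_profiles
    for b in all_profiles
    if a != b and not dominates(a, b) and not dominates(b, a)
]
print(len(candidate_pairs), "non-dominated candidate A/B pairs before fractional selection")
```

The same screening logic extends directly to the non-credible pairs described above (for example, a greater reduction in fish losses paired with a smaller population increase).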

The resulting experimental design is characterized by 72 unique A vs. B option pairs, where attribute levels for option A and B differ across each of the pairs. Each pair represents a unique choice modeling question—with a unique set of attribute levels distinguishing options A and B. Following standard practice for mail surveys, these questions will be randomly assigned to survey respondents, with each respondent considering three questions.


Information Provision

According to Arrow et al. (1993, p. 4605), if “surveys are to elicit useful information about willingness to pay, respondents must understand exactly what it is they are being asked to value.” It is also well known that the provided information can influence WTP estimates derived from stated preference survey instruments and that respondents must be provided with sufficient information to make an informed assessment of policy impacts on utility (e.g., Bergstrom and Stoll 1989; Bergstrom et al. 1989; Hoehn and Randall 2002). As stated clearly by Bateman et al. (2002, p. 122), “[d]escribing the good and the policy context of interest may require a combination of textual information, photographs, drawings, maps, charts and graphs. …[V]isual aids are helpful ways of conveying complex information…while simultaneously enhancing respondents’ attention and interest.” Given that many respondents may not be fully familiar with the details of programs to reduce I&E mortality losses and potential impacts on aquatic life, the survey will include introductory figures to aid respondents’ comprehension of the goods and policies addressed by the survey instrument, and to encourage appropriate neoclassical tradeoffs in responding to choice experiment questions.

Following this guidance of Bateman et al. (2002) and prior examples of Opaluch et al. (1993) and Johnston et al. (2002a), among others, EPA extensively pretested all graphics used in the draft mail survey, to ensure that these graphical elements were not prejudicial, and that they did not bias responses. Graphics judged to be prejudicial or confusing to respondents during the seven focus groups and cognitive interviews were revised or replaced. EPA acknowledges that certain types of graphics can be prejudicial in certain contexts—and hence all graphical elements were pretested extensively. EPA found that focus group respondents endorsed the use of graphics in the survey booklet and indicated that they helped them to visualize how fish are entrained and impinged, technological solutions, facilities locations, and ecosystem effects. Participants made such statements as, “Yeah, I’d rather have them” and “I like on page 2 the graph and illustration because the adage a picture is worth a thousand words”. EPA also emphasizes that there is no precedent or support in the literature for the total elimination of graphics in survey instruments. To the contrary, the literature explicitly indicates that pictures and graphics may be necessary and useful components of survey instruments in many cases (Bateman et al. 2002). EPA highlights that numerous peer-reviewed surveys described in the literature include pictures and graphics both in survey instruments and in introductory materials such as slide shows. For example, see Horne et al. (2005), Ready et al. (1995), Powe and Bateman (2004), Duke and Ilvento (2004), Opaluch et al. (1993), Johnston et al. (1999, 2002a, 2002b), and Mazzotta et al. (2002). Bateman et al. (2002) also includes examples of various types of survey materials including pictures and graphical elements.


Amelioration of Hypothetical Bias

EPA considers the amelioration of hypothetical bias to be a paramount concern in survey design. The Agency notes, however, based on prior evidence from the literature, that hypothetical bias is not inevitable. For example, not all research finds evidence of hypothetical bias in stated preference valuation (Champ and Bishop 2001; Smith and Mansfield 1998; Vossler and Kerkvliet 2003; Johannesson 1997), and some shows that hypothetical bias may be ameliorated using cheap talk, certainty adjustments, or other mechanisms (Champ et al. 1997; Champ et al. 2004; Cummings and Taylor 1999; Loomis et al. 1996).

To obtain reliable estimates of WTP, the Agency tested and designed all survey elements to promote incentive compatible preference elicitation mechanisms. Incentive compatible stated preference surveys provide no incentive for non-truthful preference revelation (Carson and Groves 2007). The literature is clear regarding the importance of incentive compatibility in stated preference value elicitation and the role of both question format and scenario consequentiality in ensuring this property (Carson et al. 2000; Carson and Groves 2007; Collins and Vossler 2009; Herriges et al. 2010; Johnston 2006; Vossler and Evans 2009). It has been established that referendum-type stated preference choices are incentive compatible given that certain conditions are met, including the condition that responses are believed by respondents to be consequential, or potentially influencing public policy decisions (Carson and Groves 2007; Herriges et al. 2010).

The survey is explicitly designed to emphasize the importance of the budget constraint and program cost. For example, the survey asks respondents to compare protecting aquatic ecosystems to other policy issues for which the government could potentially ask households to pay. The survey itself includes explicit reminders of program cost and the budget constraint.

The survey has also been explicitly designed to maximize the consequentiality of choice experiment questions, thereby maximizing incentive compatibility (i.e., reducing strategic and hypothetical biases), following clear guidance of Carson et al. (2000). Elements specifically designed to maximize consequentiality include: a) explicitly mentioning that this survey is associated with assessment of proposed policies that are being considered, b) numerous details provided in the survey concerning specifics of the proposed policies, and c) emphasis that the type of policy enacted will depend in part on survey results and that their vote is important. Johnston and Joglekar (2005) show the capacity of such information to eliminate hypothetical bias in choice-based stated preference WTP estimation.

Focus groups and cognitive interviews indicated that respondents viewed choices as consequential, that they considered their budget constraints when responding to all questions, and that they would answer the same way were similar questions asked in a binding referendum. When asked if they thought about the program cost in the same way as “money coming out of their pocket,” the vast majority of focus group and interview respondents indicated that they treated program costs the same way that they would have if there were actual money consequences. For example, respondents made statements such as “No. [My vote] would have been the same actually” and “If I believed that it was gonna affect regulations, I think I would have voted the exact same way.”

EPA does not anticipate significant hypothetical bias in the proposed survey based on focus group results. Focus group respondents took the survey questions seriously and indicated that they thought that their choices would actually influence policy. Regarding the potential use of cheap talk mechanisms or other devices to further address the potential for hypothetical bias, the Agency emphasizes that the literature is mixed as to their performance. For example, the seminal work by Cummings and Taylor (1999) shows that cheap talk is able to reduce hypothetical bias. Similar results are shown by Aadland and Caplan (2003). However, other authors (e.g., Cummings et al. 1995; List 2001; Brown et al. 2003) find that a cheap talk script is effective only under certain circumstances, and for certain types of respondents. For example, Cummings et al. (1995) find that a relatively short cheap talk script actually worsens hypothetical bias, while a longer script appears to ameliorate bias. Brown et al. (2003) find cheap talk effective only at higher bid amounts, a result mirrored by Murphy et al. (2004). Still other authors, including Poe et al. (2002), find no effect of cheap talk. Given these clearly mixed experiences, EPA is not convinced that cheap talk scripts would provide a panacea for hypothetical bias in the present case, although they appear to reduce bias in a limited set of circumstances, and cheap talk is therefore not included in the survey.


Amelioration of Symbolic Biases and Warm-Glow Effects

Following clear guidance of Arrow et al. (1993) and others, EPA has taken repeated steps to ensure that survey responses reflect the value of the affected fish resources only, and do not reflect symbolic or warm glow concerns (Mitchell and Carson 1989). Following explicit guidance of the NOAA Blue Ribbon Panel on Contingent Valuation (Arrow et al. 1993, p. 4609), EPA has explicitly designed all elements of the survey to “deflect the general ‘warm glow’ of giving or the dislike of ‘big business’ away from the specific program that is being valued.” This was done in a variety of ways, based on prior examples in the literature, such as asking respondents to reflect on the importance of attributes before making selections, and using a payment vehicle that does not raise trust issues (a cost of living increase rather than an electric bill increase). The focus group and cognitive interview results indicated that most participants answered the choice questions based on the effects discussed in the survey, not on a desire to help the environment in general.

The survey includes clear language instructing respondents to consider only the specific attributes in the survey, and not to base answers on broader environmental concerns, including the statement that:

“Scientists expect that effects on the environment and economy not shown explicitly will be small. For example, studies of industry suggest that effects on employment will be close to zero.”

This is also consistent with the statement from Arrow et al. (1993) that a referendum-type format may limit the warm-glow effect. Some focus group participants indicated that they were inclined to support environmental causes and would like “to do a good thing” but still considered the cost and effects under the policy options. For example, respondents stated, “[…] if we can do something to help as long as the price is right, then do it” and “I feel if it’s going to be benefit everyone and be better for the economy, I’m OK with paying a little bit more.”

This evidence notwithstanding, EPA believes that it is important to include follow-up questions to ensure that responses do not reflect symbolic biases. Question 9 in the survey instrument—which addresses the rationale for choice responses given earlier in the draft survey—explicitly tests for the presence of symbolic or warm-glow biases. Follow-up questions such as these are common in stated preference survey instruments, to assess the underlying reasons for the observed valuation responses (e.g., Mitchell and Carson 1984).


Assessing Scope Sensitivity

The NOAA Blue Ribbon Panel on Contingent Valuation (CV) (Arrow et al. 1993, p. 4605) states clearly that if “surveys are to elicit useful information about willingness to pay, respondents must understand exactly what it is they are being asked to value (or vote upon)…” They further indicate that surveys providing “sketchy details” about the results of proposed policies call “into question the estimates derived there from,” and hence suggest a high degree of detail and richness in the descriptions of scenarios. Similar guidance is provided by other key sources in the CV literature (e.g., Mitchell and Carson 1989; Louviere et al. 2000). Among the reasons for this guidance is that such descriptions tend to encourage appropriate framing and sensitivity to scope.

Following Arrow et al. (1993), Mitchell and Carson (1989), and others, while noting the clear limitations in scope tests discussed by Heberlein et al. (2005), EPA believes that it is important that survey responses in this case show sensitivity to scope. This is one of the primary reasons for the use of the choice experiment methodology, which is better able to capture WTP differentials related to changes in resource scope (Bateman et al. 2002). Unlike open-ended questions, in which scope insensitivity is a primary concern, EPA emphasizes that choice experiments generally have shown much less difficulty with respondents reacting appropriately to the scope and scale of resource changes. Moreover, as clearly noted by Bennett and Blamey (2001, p. 231), “internal scope tests are automatically available from the results of a [choice modeling] exercise.” That is, within choice experiments, sensitivity to scope is indicated by the statistical significance and sign of parameter estimates associated with program attributes (Bennett and Blamey 2001). Internal scope sensitivity will therefore be assessed through model results for the variables Commercial Fish Populations, Fish Populations (all fish), Fish Saved per Year, and Condition of Aquatic Ecosystems. Statistical significance of these variables—along with a positive sign—indicates that respondents, on average, are more likely to choose plans with larger quantities of these variables.

In addition to internal scope tests implicit in all choice experiment statistical analysis, EPA will also conduct external scope tests (cf. Giraud et al. 1999). The primary difference between internal and external tests is that the former assess sensitivity to scope across choices of a single respondent, while the latter involves split-sample assessments across different respondents. Within a choice modeling context, external scope tests are generally considered “stronger,” although also more likely to be confounded by differences in the implied choice frame (Bennett and Blamey 2001). A variety of options for external scope tests exist, depending on the structure of the stated choice questions under consideration.

In the present case, attribute-by-attribute external scope tests will be conducted over a split sub-sample of respondents considering a specific set of choices, with all attributes held constant across the considered choices except the scope of the attribute for which the test is to be conducted. For example, to conduct an external scope test for reductions in annual fish losses, one would consider a set of choices that is identical over two respondent groups, except that one group considers a choice with a greater reduction in fish losses. Assessing the choices over this split sample allows for an external test of scope. To illustrate this test, consider the following stylized choice between Option A and Option B. The generic labels “Level 0,” “Level 1,” and “Level 2” are used to denote attribute levels, where for all attributes Level 2 > Level 1 > Level 0.


Table B3: Illustration of an External Scope Test

Variable | Option A | Option B
Fish Saved per Year | Sample 1: Fish Saved Level 1; Sample 2: Fish Saved Level 2 | Fish Saved Level 0
Commercial Fish Populations | Commercial Fish Populations Level 0 | Commercial Fish Populations Level 0
Fish Populations (all fish) | Population (all fish) Level 0 | Population (all fish) Level 0
Condition of Aquatic Ecosystems | Aquatic Ecosystem Condition Level 0 | Aquatic Ecosystem Condition Level 0
Increase in Cost of Living for Your Household | Cost Level 1 | Cost Level 0



In the above example, only Fish Saved per Year and Cost vary across the choice options. Because both Fish Saved per Year (at Level 1 or Level 2) and Cost are higher in Option A than in Option B, neither option is dominant. In the illustrated split-sample test, respondent sample 1 views the choice with Fish Saved per Year at Level 1, while respondent sample 2 views an otherwise identical choice with Fish Saved per Year at Level 2, where Level 2 > Level 1. If responses are externally sensitive to scope in Fish Saved per Year, a greater proportion of sample 2 respondents than sample 1 respondents will choose Option A. This hypothesis may be assessed using a test of equal proportions across the two sub-samples, providing a simple attribute-by-attribute test of external scope. Analogous tests may be conducted for all attributes within the choice experiment design, using parallel methods. EPA emphasizes that the formal applicability of the above-noted scope test is contingent upon the specific choice frame implied by levels of other attributes in the choice question. This is a characteristic of nearly all external scope tests applied in choice experiment frameworks (Bennett and Blamey 2001).
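A minimal sketch of this comparison is given below: a pooled two-sample z-test of equal proportions, where the proportions are the shares of each sub-sample choosing Option A. The counts shown are hypothetical placeholders; actual counts would come from the split-sample survey responses, and an equivalent test is available in standard statistical packages.

```python
# Minimal sketch: pooled two-sample test of equal proportions for the external
# scope comparison described above. All counts below are hypothetical.
import math
from scipy.stats import norm

def two_proportion_z_test(x1, n1, x2, n2):
    """One-sided test of H0: p1 = p2 against H1: p2 > p1, where x/n are the
    counts and sizes of respondents choosing Option A in each sub-sample."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    return z, norm.sf(z)   # z statistic and one-sided p-value

# Sample 1 sees Fish Saved Level 1; sample 2 sees Level 2 (hypothetical counts).
z, p_value = two_proportion_z_test(x1=54, n1=150, x2=72, n2=150)
print(f"z = {z:.2f}, one-sided p = {p_value:.4f}")
```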

Split-sample tests such as those proposed above often require the addition of question versions to the experimental design, to accommodate the specific structural needs of the attribute-by-attribute external scope test. Otherwise, confounding effects of other varying attributes (including demographic information) can render results of scope tests ambiguous. In the present case, the proposed tests would require the addition of up to six unique question versions to the experimental design, enabling scope tests for the three non-cost attributes within the 316(b) choice experiment scenarios. If scope tests in additional question frames are desired (e.g., the same scope test illustrated above, but given Level 1 for commercial fish populations, fish population (all fish), and aquatic ecosystem condition attributes), still additional question versions would be added. While small numbers of questions added to the experimental design should have minimal impacts on overall efficiency (e.g., orthogonality of the design), larger numbers may have a more significant impact. Hence, given constraints on the total number of survey respondents, there is a potential empirical tradeoff between the number of external scope tests that may be conducted and the efficiency of the experimental design and statistical analysis.


Communicating Uncertainty to Respondents

EPA believes that the role of risk and uncertainty is an important issue to be addressed in the development of benefits estimates, and points out that the literature provides numerous examples of cases in which appropriate survey design, including focus groups, was used to successfully address such concerns. For example, as stated by Desvousges et al. (1984), “using contingent valuation to estimate the benefits of hazardous waste management regulations requires detailed information on how and the extent to which respondents understand risk (or probability) and how government regulatory actions might change it… Using focus groups helped make this determination…” EPA also emphasizes that all regulatory analyses involve uncertainty of some type (Boardman et al. 2001).

The ecological outcome of I&E reductions is subject to considerable uncertainty. EPA believes that it is important that survey respondents be aware of this uncertainty, and that their responses reflect the knowledge that the resource changes reflected in the survey are scientific estimates. However, EPA is also aware of the clear advice from the choice modeling literature (e.g., Bennett and Blamey 2001; Louviere et al. 2000) to avoid cognitive burden on respondents. Hence, the proposed survey materials clearly indicate the uncertainty involved with the described resource changes in choice modeling scenarios, yet do so in a way designed to minimize cognitive burden.

For example, prior to answering choice experiment questions, respondents are told:

“Although scientists can predict the number of fish saved each year, the effect on fish populations is uncertain. This is because scientists do not know the total number of fish in Northeast waters and because many factors – such as cooling water use, fishing, pollution and water temperature – affect fish.”

This statement clearly indicates the uncertainty involved with scientific estimates of the outcomes of I&E regulations. This is followed by a further reminder of uncertainty:

“Depending on the type of technology required and other factors, effects on fish and ecosystems may be different – even if the annual reduction in fish losses is similar.”

Focus group and cognitive interview participants understood that the ecological changes described in the survey were uncertain, and most participants were comfortable making decisions in the presence of this uncertainty. Their responses indicated that they understood this uncertainty based on the information presented in the introductory material and considered it when evaluating policy options. Respondents made such statements as: “My guess is that it did come from studies but I have a healthy dose of skepticism about the accuracy of it. I don’t think it’s been in any way skewed purposefully, but I know that this is a best guess, reasonable guess perhaps”, “it shows me that they are being honest for the most part. You know, you can't obviously be accurate on everything, but this is a kind of a best guess”, “[…] They had more numbers on the commercial fish population. The rest was more of a guesstimate”, and “you don’t know the exact number and nobody knows.”

In previous focus groups conducted for the Phase III survey, EPA tested alternative versions of the Phase III survey instrument in which choice experiment attributes were presented as 90% confidence ranges, rather than as point estimates. Focus group respondents were explicitly asked whether the ranges were helpful in understanding the uncertainty of estimates presented in the choice question or whether they were a source of confusion. Seven out of the eight respondents interviewed on that occasion indicated that the use of ranges was more confusing than the use of point estimates. Furthermore, respondents were comfortable making decisions in the presence of this uncertainty.


5(c) Reporting Results


The results of the survey will be made public as part of the benefits analysis for the 316(b) regulation for existing facilities. Provided information will include summary statistics for the survey data, extensive documentation for the statistical analysis, and a detailed description of the final results. The survey data will be released only after it has been thoroughly vetted to ensure that all potentially identifying information has been removed.

REFERENCES


Aadland, D., and A.J. Caplan. 2003. “Willingness to Pay for Curbside Recycling with Detection and Mitigation of Hypothetical Bias.” American Journal of Agricultural Economics 85(2): 492-502.

Adamowicz, W., P. Boxall, M. Williams, and J. Louviere. 1998. “Stated Preference Approaches for Measuring Passive Use Values: Choice Experiments and Contingent Valuation.” American Journal of Agricultural Economics 80(1): 64-75.

Akter, S., R. Brower, L. Brander, and P. Van Beukering. 2009. “Respondent Uncertainty in a Contingent Market for Carbon Offsets.” Ecological Economics 68(6): 1858-1863.

Arrow, K., R. Solow, E. Leamer, P. Portney, R. Radner, and H. Schuman. 1993. “Report of the NOAA Panel on Contingent Valuation.” Federal Register 58(10): 4602-4614.

Bateman, I.J., R.T. Carson, B. Day, M. Hanemann, N. Hanley, T. Hett, M. Jones-Lee, G. Loomes, S. Mourato, E. Ozdemiroglu, D.W. Pearce, R. Sugden, and J. Swanson. 2002. Economic Valuation with Stated Preference Surveys: A Manual. Northampton, MA: Edward Elgar.

Bennett, J. and R. Blamey, eds. 2001. The Choice Modelling Approach to Environmental Valuation. Northampton, MA: Edward Elgar.

Bergstrom, J.C. and J.R. Stoll. 1989. “Application of experimental economics concepts and precepts to CVM field survey procedures.” Western Journal of Agricultural Economics 14(1): 98-109.

Bergstrom, J.C., J.R. Stoll, and A. Randall. 1989. “Information effects in contingent markets.” American Journal of Agricultural Economics 71(3): 685-691.

Besedin, Elena, Robert Johnston, Matthew Ranson, and Jenny Ahlen, Abt Associates Inc. 2005. “Findings from 2005 Focus Groups Conducted Under EPA ICR #2155.01.” Memo to Erik Helm, U.S. EPA/OW, October 18, 2005. See docket for EPA ICR #2155.02

Blamey, R.K., J.W. Bennett, and M.D. Morrison. 1999. “Yea-saying in Contingent Valuation Surveys.” Land Economics 75: 126-141.

Boardman, A.E., D.H. Greenberg, A.R. Vining, and D.L. Weimer. 2001. Cost-Benefit Analysis: Concepts and Practice, 2nd edition. Upper Saddle River, NJ: Prentice Hall.

Boyle, K.J. 2003. “Contingent valuation in practice.” In A Primer on Nonmarket Valuation. Edited by P.A. Champ, K.J. Boyle, and T.C. Brown, Kluwer Academic Publishers.

Brown, T. C., I. Ajzen, and D. Hrubes. 2003. “Further Tests of Entreaties to Avoid Hypothetical Bias in Referendum Contingent Valuation.” Journal of Environmental Economics and Management 46(2): 353-361.

Bunch, D.S., and R.R. Batsell. 1989. “A Monte Carlo Comparison of Estimators for the Multinomial Logit Model.” Journal of Marketing Research 26: 56-68.

Cameron, T.A., and D.D. Huppert. 1989. “OLS versus ML Estimation of Non-market Resource Values with Payment Card Interval Data.” Journal of Environmental Economics and Management 17: 230-246.

Carson, R.T., and T. Groves. 2007. “Incentives and informational properties of preference questions.” Environmental and Resource Economics 37(1): 181-210.

Carson, R.T., T. Groves, and M.J. Machina. 2000. “Incentive and Informational Properties of Preference Questions.” Working Paper, Department of Economics, University of California, San Diego.

Champ P.A., and R.C. Bishop. 2001. “Donation Payment Mechanisms and Contingent Valuation: An Empirical Study of Hypothetical Bias.” Environmental and Resource Economics 19(4): 383-402.

Champ, P.A., R.C. Bishop, T.C. Brown, and D.W. McCollum. 1997. “Using Donation Mechanisms to Value Non-use Benefits from Public Goods.” Journal of Environmental Economics and Management 33(2): 151-162.

Champ, P.A., R. Moore, and R. C. Bishop. 2004. “Hypothetical Bias: The Mitigating Effects of Certainty Questions and Cheap Talk.” Selected paper prepared for presentation at the American Agricultural Economics Association Annual Meeting, Denver, Colorado.

Champ, P.A., R. Moore, and R.C. Bishop. 2009. “A Comparison of Approaches to Mitigate Hypothetical Bias.” Agricultural and Resource Economics Review 38(2): 166-180.

Collins, J.P., and C.A. Vossler. 2009. “Incentive compatibility tests of choice experiment value elicitation questions.” Journal of Environmental Economics and Management 58(2): 226-235.

Croke, K., R.G. Fabian, and G. Brenniman. 1986. “Estimating the Value of Improved Water Quality in an Urban River System.” Journal of Environmental Systems 16(1): 13-24.

Cronin, F.J. 1982. “Valuing Nonmarket Goods Through Contingent Markets.” Pacific Northwest Laboratory, PNL 4255, Richland, WA.

Cummings, R.G., and G.W. Harrison. 1995. “The Measurement and Decomposition of Non-use Values: A Critical Review.” Environmental and Resource Economics 5: 225-247.

Cummings, R. G., G.W. Harrison, and L.L. Osborne. 1995. “Can the Bias of Contingent Valuation Surveys Be Reduced?” Economics working paper, Columbia, SC: Division of Research, College of Business Administration, Univ. of South Carolina.

Cummings, R.G., P.T. Ganderton, and T. McGuckin. 1994. “Substitution Effects in CVM Values.” American Journal of Agricultural Economics 76(2): 205-214.

Cummings, R.G., and L.O. Taylor. 1999. “Unbiased Value Estimates for Environmental Goods: A Cheap Talk Design for the Contingent Valuation Method.” American Economic Review 89(3): 649-665.

Desvousges, W.H., and V.K Smith. 1988. “Focus Groups and Risk Communication: the Science of Listening to Data.” Risk Analysis 8: 479-484.

Desvousges, W.H., V.K. Smith, D.H. Brown, and D.K. Pate. 1984. “The Role of Focus Groups in Designing a Contingent Valuation Survey to Measure the Benefits of Hazardous Waste Management Regulations.” Research Triangle Institute: Research Triangle Park, NC.

Dillman, D.A. 2008. Mail and Internet Surveys: The Tailored Design Method. New York: John Wiley and Sons.

Duke, J.M., and T.W. Ilvento. 2004. “A Conjoint Analysis of Public Preferences for Agricultural Land Preservation.” Agricultural and Resource Economics Review 33(2): 209-219.

Entergy Corp. v. Riverkeeper Inc., 129 S. Ct. 1498, 1505 (2009)

Freeman, A.M., III. 2003. The Measurement of Environmental and Resource Values: Theory and Methods. Washington, DC: Resources for the Future.

Giraud, K.L., J.B. Loomis, and R.L. Johnson. 1999. “Internal and external scope in willingness-to-pay estimates for threatened and endangered wildlife.” Journal of Environmental Management 56: 221-229.

Greene, W.H. 2002. NLOGIT Version 3.0 Reference Guide. Plainview, NY: Econometric Software, Inc.

Greene, W.H. 2003. Econometric Analysis. 5th ed., Prentice Hall, Upper Saddle River, NJ.

Haab, T.C., and K.E. McConnell. 2002. Valuing Environmental and Natural Resources: The Econometrics of Non-market Valuation. Cheltenham, UK: Edward Elgar.

Hanemann, W.M. 1984. “Welfare Evaluations in Contingent Valuation Experiments with Discrete Responses.” American Journal of Agricultural Economics 66(3): 332-41.

Hanemann, W.M., and B. Kanninen. 1999. “The Statistical Analysis of Discrete-Response CV Data.” In Valuing Environmental Preferences: Theory and Practice of the Contingent Valuation Method in the US, EU, and Developing Countries. Edited by I.J. Bateman and K.G. Willis, Oxford University Press, Oxford, UK.

Hanley, N., S. Colombo, D. Tinch, A. Black, and A. Aftab. 2006a. "Estimating the benefits of water quality improvements under the Water Framework Directive: are benefits transferable?," European Review of Agricultural Economics 33(3):391-413.

Hanley, N., R. E. Wright, and B. Alvarez-Farizo. 2006b. “Estimating the Economic Value of Improvements in River Ecology using Choice Experiments: An Application to the Water Framework Directive.” Journal of Environmental Management 78(2):183-193.

Heberlein, T.A., M.A. Wilson, R.C. Bishop, and N.C. Schaeffer. 2005. “Rethinking the Scope Test as a Criterion in Contingent Valuation.” Journal of Environmental Economics and Management 50(1): 1-22.

Hensher, D.A., and P.O. Barnard. 1990. “The Orthogonality Issue in Stated Choice Designs.” In Spatial Choices and Processes. Edited by M. Fischer, P. Nijkamp, and Y. Papageorgiou. Amsterdam: North-Holland, 265-278.

Herriges, J., C. Kling, C. Liu, and J. Tobias. 2010. “What are the consequences of consequentiality?” Journal of Environmental Economics and Management 59(1): 67-81.

Hoehn, J. P. 1991. “Valuing the Multidimensional Impacts of Environmental Policy: Theory and Methods.” American Journal of Agricultural Economics 73(2): 289-299.

Hoehn, J.P., F. Lupi, and M.D. Kaplowitz. 2004. Internet-Based Stated Choice Experiments in Ecosystem Mitigation: Methods to Control Decision Heuristics and Biases. In Proceedings of Valuation of Ecological Benefits: Improving the Science Behind Policy Decisions, a workshop sponsored by the US EPA National Center for Environmental Economics and the National Center for Environmental Research.

Hoehn, J.P., and A. Randall. 2002. “The Effect of Resource Quality Information on Resource Injury Perceptions and Contingent Values.” Resource and Energy Economics 24: 13-31.

Horne, P., P.C. Boxall, and W.L. Adamowicz. 2005. “Multiple-use management of forest recreation sites: a spatially explicit choice experiment.” Forest Ecology and Management 207(1/2): 189-99.

Johannesson, M. 1997. “Some Further Experimental Results on Hypothetical Versus Real Willingness to Pay.” Applied Economics Letters 4: 535-536.

Johnston, R.J., E.T. Schultz, K. Segerson and E.Y. Besedin. 2010. “Bioindicator-Based Stated Preference Valuation for Aquatic Habitat and Ecosystem Service Restoration”, In Bennett, J. ed. International Handbook on Non-Marketed Environmental Valuation. Cheltenham, UK: Edward Elgar, forthcoming.

Johnston, R.J. 2006. “Is Hypothetical Bias Universal? Validating Contingent Valuation Responses Using a Binding Public Referendum.” Journal of Environmental Economics and Management 52(1):469-481.

Johnston, R.J., and D.P. Joglekar. 2005. “Validating Hypothetical Surveys Using Binding Public Referenda: Implications for Stated Preference Valuation.” American Agricultural Economics Association (AAEA) Annual Meeting, Providence, July 24-27.

Johnston, R.J., J.J. Opaluch, M.J. Mazzotta, and G. Magnusson. 2005. “Who Are Resource Non-users and What Can They Tell Us About Non-use Values? Decomposing User and Non-user Willingness to Pay for Coastal Wetland Restoration.” Water Resources Research 41(7), doi:10.1029/2004WR003766.

Johnston, R.J., E.Y. Besedin, and R.F. Wardwell. 2003a. “Modeling Relationships Between Use and Non-use Values for Surface Water Quality: A Meta-Analysis.” Water Resources Research 39(12): 1363.

Johnston, R.J., S.K. Swallow, T.J. Tyrrell, and D.M. Bauer. 2003b. “Rural Amenity Values and Length of Residency.” American Journal of Agricultural Economics 85(4): 1000-1015.

Johnston, R.J., G. Magnusson, M. Mazzotta, and J.J.Opaluch. 2002a. “Combining Economic and Ecological Indicators to Prioritize Salt Marsh Restoration Actions.” American Journal of Agricultural Economics 84(5): 1362-1370.

Johnston, R.J., S.K. Swallow, C.W. Allen, and L.A. Smith. 2002b. “Designing Multidimensional Environmental Programs: Assessing Tradeoffs and Substitution in Watershed Management Plans.” Water Resources Research 38(7): IV1-13.

Johnston, R.J., S.K. Swallow, and T.F. Weaver. 1999. “Estimating Willingness to Pay and Resource Trade-offs With Different Payment Mechanisms: An Evaluation of a Funding Guarantee for Watershed Management.” Journal of Environmental Economics and Management 38(1): 97-120.

Johnston, R.J., T.F. Weaver, L.A. Smith, and S.K. Swallow. 1995. “Contingent Valuation Focus Groups: Insights From Ethnographic Interview Techniques.” Agricultural and Resource Economics Review 24(1): 56-69.

Just, R.E., D.L. Hueth, and A. Schmitz. 2004. The Welfare Economics of Public Policy: A Practical Approach to Project and Policy Evaluation. Edward Elgar Publishing, Cheltenham, UK and Northampton, MA.

Kaplowitz, M.D., F. Lupi, and J.P. Hoehn. 2004. “Multiple Methods for Developing and Evaluating a Stated-Choice Questionnaire to Value Wetlands.” Chapter 24 in Methods for Testing and Evaluating Survey Questionnaires. Edited by S. Presser, J.M. Rothgeb, M.P. Couper, J.T. Lessler, E. Martin, J. Martin, and E. Singer. New York: John Wiley and Sons.

Kobayashi, M., K. Rollins, and M.D.R. Evans. 2010. “Sensitivity of WTP Estimates to Definition of 'Yes': Reinterpreting Expressed Response Intensity.” Agricultural and Resource Economics Review 39(1): 37-55.

Kuhfeld, W.F. 2009. “Experimental Design: Efficiency, Coding and Choice Designs.” SAS Institute. http://support.sas.com/techsup/tnote/tnote_stat.html#market.

Layton, D.F. 2000. “Random coefficient models for stated preference surveys.” Journal of Environmental Economics and Management 40(1): 21-36.

List, J.A. 2001. “Do Explicit Warnings Eliminate the Hypothetical Bias in Elicitation Procedures? Evidence from Field Auctions for Sportscards.” American Economic Review 91(5): 1498-1507.

Loomis, J., T. Brown, B. Lucero, and G. Peterson. 1996. “Improving Validity Experiments of Contingent Valuation Methods: Results of Efforts to Reduce the Disparity of Hypothetical and Actual Willingness to Pay.” Land Economics 72(4): 450-461.

Louviere, J.J., D.A. Hensher, and J.D. Swait. 2000. Stated Preference Methods: Analysis and Application. Cambridge, UK: Cambridge University Press.

Maddala, G.S. 1983. “Limited-Dependent and Qualitative Variables in Econometrics.” Econometric Society Monographs No. 3, Cambridge University Press, Cambridge.

Mazzotta, M.J., J.J. Opaluch, G. Magnuson, and R.J. Johnston. 2002. “Setting Priorities for Coastal Wetland Restoration: A GIS-Based Tool That Combines Expert Assessments And Public Values.” Earth System Monitor 12(3): 1-6.

McConnell, K.E. 1990. “Models for Referendum Data: The Structure of Discrete Choice Models for Contingent Valuation.” Journal of Environmental Economics and Management 18(1): 19-34.

McFadden, D., and K. Train. 2000. “Mixed Multinomial Logit Models for Discrete Responses.” Journal of Applied Econometrics 15(5): 447-470.

Mitchell, R.C., and R.T. Carson. 1981. An Experiment in Determining Willingness to Pay for National Water Quality Improvements. Preliminary draft of a report to the U.S. Environmental Protection Agency. Resources for the Future, Inc., Washington.

Mitchell, R.C., and R.T. Carson. 1984. A Contingent Valuation Estimate of National Freshwater Benefits: Technical Report to the U.S. Environmental Protection Agency. Washington, DC: Resources for the Future.

Mitchell, R.C., and R.T. Carson. 1989. Using Surveys to Value Public Goods: The Contingent Valuation Method. Resources for the Future, Washington, D.C.

Morrison, M., and J. Bennett. 2004. “Valuing New South Wales Rivers for Use in Benefit Transfer.” Australian Journal of Agricultural and Resource Economics 48(4): 591-611.

Murphy, J.J., T. Stevens, and D. Weatherhead. 2004. “Is Cheap Talk Effective at Eliminating Hypothetical Bias in a Provision Point?” Working Paper No. 2003-2. Department of Resource Economics, University of Massachusetts, Amherst.

Olsen, D., J. Richards, and R.D. Scott. 1991. “Existence and Sport Values for Doubling the Size of Columbia River Basin Salmon and Steelhead Runs.” Rivers 2(1): 44-56.

Opaluch, J.J., T.A. Grigalunas, M. Mazzotta, R.J. Johnston, and J. Diamantedes. 1999. Recreational and Resource Economic Values for the Peconic Estuary. Prepared for the Peconic Estuary Program. Peace Dale, RI: Economic Analysis Inc. 124 pp.

Opaluch, J.J., S.K. Swallow, T. Weaver, C. Wessells, and D. Wichelns. 1993. “Evaluating impacts from noxious facilities: Including public preferences in current siting mechanisms.” Journal of Environmental Economics and Management 24(1): 41-59.

Poe, G. L., J.E. Clark, D. Rondeau, and W.D. Schulze. 2002. “Provision Point Mechanisms and Field Validity Tests of Contingent Valuation.” Environmental and Resource Economics 23: 105-131.

Poe, G.L., M.P. Welsh, and P.A. Champ. 1997. “Measuring the Difference in Mean Willingness to Pay when Dichotomous Choice Contingent Valuation Responses are not Independent.” Land Economics 73(2): 255-267.

Powe, N.A. 2007. Redesigning Environmental Valuation: Mixing Methods within Stated Preference Techniques. Cheltenham, UK: Edward Elgar.

Powe, N.A., and I.J. Bateman. 2004. “Investigating Insensitivity to Scope: A Split-Sample Test of Perceived Scheme Realism.” Land Economics 80(2): 258-271.

Ready, R.C., P.A. Champ, and J.L. Lawton. 2010. “Using Respondent Uncertainty to Mitigate Hypothetical Bias in a Stated Choice Experiment.” Land Economics 86(2): 363-381.

Ready, R.C., J.C. Whitehead, and G.C. Blomquist. 1995. “Contingent Valuation When Respondents are Ambivalent.” Journal of Environmental Economics and Management 29(2): 181-196.

Smith, V. K., and C. Mansfield. 1998. “Buying Time: Real and Hypothetical Offers.” Journal of Environmental Economics and Management 36: 209-224.

Schkade, D.A. and J.W. Payne. 1994. “How People Respond to Contingent Valuation Questions: A Verbal Protocol Analysis of Willingness to Pay for an Environmental Regulation.” Journal of Environmental Economics and Management 26: 88-109.

Train, K. 1998. “Recreation Demand Models with Taste Differences Over People.” Land Economics 74(2): 230-239.

U.S. Department of Labor, Bureau of Labor Statistics. 2009. Table 1: Civilian workers, by major occupational and industry group. September 2009. http://www.bls.gov/news.release/ecec.t01.htm.

U.S. EPA. 2000. Guidelines for Preparing Economic Analyses. (EPA 240-R-00-003). U.S. EPA, Office of the Administrator, Washington, DC, September 2000.

U.S. EPA. 2006. Peer Review Handbook 3rd Edition. (EPA 100-B-06-002). U.S. EPA, Science Policy Council, Washington, DC, 2006.

Versar. 2006. Comments Summary Report: Peer Review Package for "Willingness to Pay Survey Instrument for §316(b) Phase III Cooling Water Intake Structures.” Prepared by Versar Inc., Springfield, VA.

Vossler, C.A., and M.F. Evans. 2009. “Bridging the gap between the field and the lab: Environmental goods, policy maker input, and consequentiality.” Journal of Environmental Economics and Management 58(3):338-345.

Vossler, C.A., and J. Kerkvliet. 2003. “A Criterion Validity Test of the Contingent Valuation Method: Comparing Hypothetical and Actual Voting Behavior for a Public Referendum.” Journal of Environmental Economics and Management 45(3): 631-649.

Whitehead, J.C., G.C. Blomquist, T.J. Hoban, and W.B. Clifford. 1995. “Assessing the Validity and Reliability of Contingent Values: A Comparison of On Site Users, Off Site Users, and Non-users.” Journal of Environmental Economics and Management 29(2): 238-251.

Whitehead, J.C., and P.A. Groothuis. 1992. “Economic Benefits of Improved Water Quality: a Case Study of North Carolina's Tar Pamlico River.” Rivers 3: 170-178.

Attachment 1: Full Text of Regional Stated Preference Survey Component (Northeast Regional Example)

OMB Control No. 2040-XXXX

Approval expires XX/XX/XX





Fish and Aquatic Habitat

A Survey of Northeast Residents

(CT, DC, DE, MA, MD, ME, NH, NJ, NY, PA, RI, VT)

The public reporting and recordkeeping burden for this collection of information is estimated to average 30 minutes per response. Send comments on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden, including through the use of automated collection techniques to the Director, Collection Strategies Division, U.S. Environmental Protection Agency (2822T), 1200 Pennsylvania Ave., NW, Washington, D.C. 20460. Include the OMB control number in any correspondence. Do not send the completed survey to this address.



Human Activities, Aquatic Habitat and Fish

This survey asks for your opinions regarding policies that would affect fish and habitat in the Northeast U.S. Your answers will help the government decide which policies will be enacted. Background information in this survey was provided by the National Marine Fisheries Service, U.S. Environmental Protection Agency, U.S. Geological Survey and other state and federal offices.


Northeast fresh and salt waters support billions of fish. These include fish that are used by humans, as well as forage fish that are not used by humans, but serve as food for larger fish, birds, and animals.



This survey concerns proposed policies that would reduce fish losses caused by cooling water use by industrial facilities, including factories and power plants. These policies would benefit aquatic ecosystems but would increase the costs of some goods and services you buy, including electricity and common household products.




How Does Cooling Water Affect Fish?

The water that industrial facilities use to cool equipment is pumped from bays, rivers, and lakes. The largest amount is used by power plants that produce electricity.


Cooling Water Intake Screen



How Fish Are Affected by Water Intake


Cooling water use affects fresh and salt waters throughout the Northeast US, but 93% of all fish losses are in coastal bays, estuaries, and tidal rivers.







What Kinds of Fish Are Affected?




Cooling water use is not the largest cause of fish loss in most areas (fishing causes greater losses), but has affected some fish populations.


About 1/6 of the fish lost are species caught by commercial and recreational fishermen. Examples include striped bass, flounder, and cod.


The other 5/6 of the fish lost are forage species that are not caught by humans but serve as food for larger fish, birds, and animals. Examples include killifish, silverside, and stickleback.


Question 1. When thinking about how industrial facilities use cooling water, please rate the importance of the following to you. Check one box for each.

(Response scale: 1 = Not Important, 3 = Somewhat Important, 5 = Very Important)

a. Preventing the loss of fish that are caught by humans: 1 2 3 4 5

b. Preventing the loss of fish that are not caught by humans: 1 2 3 4 5

c. Maintaining the ecological health of rivers, lakes and bays: 1 2 3 4 5

d. Keeping the cost of goods and services low: 1 2 3 4 5

e. Making sure there is enough government regulation of industry: 1 2 3 4 5

f. Making sure there is not too much government regulation of industry: 1 2 3 4 5






How Many Fish Are Affected?





After accounting for the number of eggs and larvae that would be expected to survive to adulthood, scientists estimate that the equivalent of about 1.1 billion young adult fish (the equivalent of one year old) are lost each year in Northeast coastal and fresh waters due to cooling water use.


Scientists can predict the number of these fish that will be saved under different policies. This number ranges from less than 0.1 to 1.0 billion fish saved per year.


For commercial fish species, losses of young fish in cooling water intakes vary by species, from the equivalent of less than 0.1% to about 10% of a species’ total population.


Scientists expect the yearly effects on other fish species are in the same 0.1% to 10% range. The number of young fish lost in cooling water intakes relative to the total number of fish in the water is relatively high for some species, but low for others.



Although scientists can predict the number of fish saved each year, the effect on fish populations is uncertain. This is because scientists do not know the total number of all fish in Northeast waters and because many factors – such as cooling water use, fishing, pollution and water temperature – affect fish.


The following page provides information on policies that would be required to reduce these fish losses.

Smaller effect on Striped Bass

Larger effect on Winter Flounder

4



New Regulations Are Being Proposed to Protect Fish







Advanced filters and closed cycle cooling are already in use at many facilities and are proven technologies. New regulations would require a mix of advanced filters and closed cycle cooling at all facilities—with reductions in fish losses between 5% and 95%.



5



How important are these issues to you?



While these policies would reduce fish losses, they would also increase the costs of producing many goods and services — these costs would be passed on to consumers like you.






Question 2. Compared to other issues that the government might address—such as public safety, education and health—how important is protecting aquatic ecosystems to you? Check one box.



Not Important



Somewhat Important




Very Important







Protecting aquatic ecosystems is

1

2

3

4

5







The government needs to know whether households are willing to pay the costs of these new policies.


This survey will ask you to compare policies with different effects on cooling water use, fish, and costs to your household. You will be asked to vote for the options you prefer.


You will also have the opportunity to support the current situation, with no new policies, and no new costs to your household.


6


This survey is similar to a public vote



7

The next part of this survey will ask you to consider different types of policies to protect fish, and indicate how you would vote. Effects of each possible policy will be described using the following scores:


Effect of Policy


What It Means

Commercial Fish Populations

(Fish Used by People)


A score between 0 and 100 percent showing the overall health of commercial and recreational fish populations. Higher scores mean more fish and greater fishing potential. A score of 100 means that these fish populations are at a size that maximizes long-term harvest; 0 means no harvest. The current score in Northeast waters is 42.


Fish Populations

(All Fish)


A score between 0 and 100 percent showing the estimated size of all fish populations compared to natural levels without human influence. A score of 100 means that populations are the largest natural size possible; 0 means no fish. The current score in Northeast waters is 26.

Fish Saved

(per Year)


A score between 0 and 100 percent showing the reduction in young fish lost compared to current levels. A score of 100 would mean that no fish are lost in cooling water intakes (all fish would be saved because of the new policy). The current score in Northeast waters is 0. This represents the status quo (no policy) with about 12% of plants already using advanced cooling systems.

Condition of

Aquatic Ecosystems


A score between 0 and 100 percent showing the ecological condition of affected areas, compared to the most natural waters in the Northeast. The score is determined by many factors, including water quality and temperature, the health of aquatic species, and habitat conditions. Higher scores mean the area is more natural. The current score in Northeast waters is 50.

$

Cost per Year


How much the policy will cost your household, in unavoidable price increases for products and services you buy, including electricity and common household products.



How Would You Rate the Importance of These Effects?

Question 3. When considering policies that affect how facilities use cooling water, how important to you are effects on each of the following scores? Check one box for each. (For reminders of what the scores mean, please see page 7).



Not Important



Somewhat Important




Very Important







  1. Effect on commercial fish populations

1

2

3

4

5

  1. Effect on fish populations (all fish)

1

2

3

4

5

  1. Effect on fish saved

1

2

3

4

5

  1. Effect on the condition of aquatic ecosystems

1

2

3

4

5

  1. Effect on cost to my household

1

2

3

4

5





The next questions will ask you to choose between different policy options that would affect fish losses in cooling water systems. You will be given choices and asked to vote for the choice you prefer by checking the appropriate box. Questions will look similar to the sample on the next page.

8



SAMPLE QUESTION
Questions will look like the sample below.


Policy Effect


Current Situation
(No policy)

Option A

Option B

Commercial Fish Populations

(in 3-5 Years)


42%

(100% is populations that allow for maximum harvest)

45%

(100% is populations that allow for maximum harvest)

48%

(100% is populations that allow for maximum harvest)

Fish Populations

(all fish)

(in 3-5 Years)


26%

(100% is populations without human influence)

27%

(100% is populations without human influence)

28%

(100% is populations without human influence)

Fish Saved per Year

(Out of 1.1 billion fish lost in water intakes)


0%

No change in status quo

5%

<0.1 billion fish saved

50%

0.6 billion fish saved

Condition of Aquatic Ecosystems

(in 3-5 Years)


50%

(100% is pristine condition)

51%

(100% is pristine condition)

52%

(100% is pristine condition)

$

Increase in Cost of Living for Your Household


$0

No cost increase



$36

per year

($3 per month)


$72

per year

($6 per month)


HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)


X


I would vote for

NO POLICY

X


I would vote for

OPTION A

X


I would vote for

OPTION B

If you prefer Option A, check this box

If you do not want A or B, check this box

If you prefer Option B,
check this box



9



As you vote, please remember


  • The map below shows the facilities and areas that would be affected by the proposed policies.

  • The policy options (A and B) given to you each require a different mix of advanced filters and closed cycle cooling in different areas, so effects on fish are different.

  • You will be shown different questions, with different combinations of technology and different costs.

  • Depending on the policies chosen, costs to your household could range from $0 per year to a maximum of $72 per year (from $0 per month to a maximum of $6 per month).

  • Depending on the type of technology required and other factors, effects on fish and ecosystems may be different—even if the annual reduction in fish losses is similar.

  • Consider each pair of policy options separately—do not add them up or compare programs from different pages.

  • Scientists expect that effects on the environment and economy not shown explicitly will be small. For example, studies of industry suggest that effects on employment will be close to zero.

  • Your votes are important. Answer all questions as if this were a real, binding vote.





10


Question 4. Assume that Options A and B would require a different mix of filters and closed cycle cooling in different areas. Assume all types of fish are affected. How would you vote?




Policy Effect

NE Waters


Current Situation
(No policy)

Option A

NE Waters

Option B

NE Waters

Commercial Fish Populations

(in 3-5 Years)


42%

(100% is populations that allow for maximum harvest)

45%

(100% is populations that allow for maximum harvest)

48%

(100% is populations that allow for maximum harvest)

Fish Populations

(all fish)

(in 3-5 Years)


26%

(100% is populations without human influence)

30%

(100% is populations without human influence)

27%

(100% is populations without human influence)

Fish Saved per Year

(Out of 1.1 billion fish lost in water intakes)


0%

No change in status quo

5%

<0.1 billion fish saved

5%

<0.1 billion fish saved

Condition of Aquatic Ecosystems

(in 3-5 Years)


50%

(100% is pristine condition)

52%

(100% is pristine condition)

54%

(100% is pristine condition)

$

Increase in Cost of Living for Your Household


$0

No cost increase



$48

per year

($4 per month)


$48

per year

($4 per month)


HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)


I would vote for

NO POLICY

I would vote for

OPTION A

I would vote for

OPTION B

11



POLICIES COULD REQUIRE DIFFERENT COMBINATIONS OF TECHNOLOGY


Now you will be asked to consider a new set of policy options for Northeast waters. As you vote, please remember—

  • Questions 5 and 6 present new sets of policy options. These options require a different mix of technologies in different areas.

  • Each question is a separate vote. Questions 5 and 6 cannot be directly compared to each other, or to Question 4.

  • Do not add up effects or costs across different questions.

  • Policy costs and effects depend on many factors. Saving more fish does not necessarily mean that all effects will improve.

12


Question 5. Assume that Options A and B would require a different mix of filters and closed cycle cooling in different areas. Assume all types of fish are affected. How would you vote?




Policy Effect

NE Waters


Current Situation
(No policy)

Option A

NE Waters

Option B

NE Waters

Commercial Fish Populations

(in 3-5 Years)


42%

(100% is populations that allow for maximum harvest)

48%

(100% is populations that allow for maximum harvest)

48%

(100% is populations that allow for maximum harvest)

Fish Populations

(all fish)

(in 3-5 Years)


26%

(100% is populations without human influence)

28%

(100% is populations without human influence)

30%

(100% is populations without human influence)

Fish Saved per Year

(Out of 1.1 billion fish lost in water intakes)


0%

No change in status quo

50%

0.6 billion fish saved

95%

0.8 billion fish saved

Condition of Aquatic Ecosystems

(in 3-5 Years)


50%

(100% is pristine condition)

51%

(100% is pristine condition)

52%

(100% is pristine condition)

$

Increase in Cost of Living for Your Household


$0

No cost increase



$72

per year

($6 per month)


$60

per year

($5 per month)


HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)


I would vote for

NO POLICY

I would vote for

OPTION A

I would vote for

OPTION B


13


Question 6. Assume that Options A and B would require a different mix of filters and closed cycle cooling in different areas. Assume all types of fish are affected. How would you vote?




Policy Effect

NE Waters


Current Situation
(No policy)

Option A

NE Waters

Option B

NE Waters

Commercial Fish Populations

(in 3-5 Years)


42%

(100% is populations that allow for maximum harvest)

48%

(100% is populations that allow for maximum harvest)

45%

(100% is populations that allow for maximum harvest)

Fish Populations

(all fish)

(in 3-5 Years)


26%

(100% is populations without human influence)

27%

(100% is populations without human influence)

27%

(100% is populations without human influence)

Fish Saved per Year

(Out of 1.1 billion fish lost in water intakes)


0%

No change in status quo

50%

0.6 billion fish saved

50%

0.6 billion fish saved

Condition of Aquatic Ecosystems

(in 3-5 Years)


50%

(100% is pristine condition)

52%

(100% is pristine condition)

52%

(100% is pristine condition)

$

Increase in Cost of Living for Your Household


$0

No cost increase



$72

per year

($6 per month)


$12

per year

($1 per month)


HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)


I would vote for

NO POLICY

I would vote for

OPTION A

I would vote for

OPTION B



14



Question 7. If you always voted for NO POLICY in questions 4-6, what was the primary reason? Check one. (Skip this question if you voted for Option A or B in any question above.)

____ The cost to my household was too high

____ Preventing fish losses is not important to me

____ I do not trust the government to fix the problem

____ I would rather spend my money on other things

____ I did not believe the choices were realistic

____ Since the problem was created by private facilities, they should fix it without passing costs on to consumers


Question 8. Indicate how strongly you agree with the following statements about questions 4 - 6 and the information provided. Check one box for each.



Strongly Disagree

Disagree

Neutral

Agree

Strongly Agree

The survey provided enough information for me to make informed choices

1

2

3

4

5

I feel confident about my answers

1

2

3

4

5

Information in the survey was easy for me to understand

1

2

3

4

5

Information in the survey was fair and unbiased

1

2

3

4

5

Questions were easy for me to answer

1

2

3

4

5

I would vote the same way in an actual public vote

1

2

3

4

5

The effect of the proposed policies depends on many factors

1

2

3

4

5

Future ecological conditions are never 100% guaranteed

1

2

3

4

5

15


Question 9. How much did the following factors affect your answers to questions 4 – 6? Check one box for each row.






Effect on my answers to questions 4-6


Very Small Effect


Moderate Effect


Very Large Effect








Wanting to reduce taxes or costs to my household.

1

2

3

4

5

Wanting to prevent the loss of industrial jobs.

1

2

3

4

5

Wanting to preserve fish for commercial fishing.

1

2

3

4

5

Wanting to send a message that all environmental issues are important regardless of cost.

1

2

3

4

5

Wanting to preserve fish for recreation (fishing, etc.).

1

2

3

4

5

Wanting to preserve fish to benefit aquatic ecosystems.

1

2

3

4

5

Wanting to know that fish exist in local lakes, rivers and bays.

1

2

3

4

5

Wanting to pay my fair share for government programs.

1

2

3

4

5

Wanting to sustain the competitiveness of US business

1

2

3

4

5

Wanting to preserve fish as a source of food for people.

1

2

3

4

5

Wanting to preserve fish and ecosystems for future generations.

1

2

3

4

5

16




Question 10. How many days did you participate in the following during the last year? For trips longer than one day, please count each day separately. Check one box for each row.






Number of days you did the activity during the past year


0

1-5

6-10

11-15

16+







Boating / Canoeing / Kayaking

1

2

3

4

5

Swimming / Going to the Beach

1

2

3

4

5

Recreational Fishing (Fresh Water)

1

2

3

4

5

Recreational Fishing (Salt Water)

1

2

3

4

5

Shellfishing / Crabbing

1

2

3

4

5

Scuba Diving / Snorkeling

1

2

3

4

5





Question 11. Do you consume commercially caught fish or seafood? Yes No


Do you consume recreationally caught fish or seafood? Yes No





17


The following questions ensure that all groups are fairly represented. All answers are kept confidential to the extent provided by law.

Thank you for your participation in this important survey!




  1. What is your age? years

  2. What is your gender? Male Female

  3. What is the highest level of education that you have completed?

Less than high school One or more years of college

High school or equivalent Bachelor’s Degree

High school + technical school Graduate Degree

  4. How many people live in your household?

  5. How many of these people are 16 years of age or older? ____

  6. How many of these people are 6 years of age or younger? ____

  7. What is your zip code?

  8. Are you currently employed? Yes No

  9. Are you currently employed in the commercial fish industry? Yes No

  10. Are you of Hispanic or Latino origin? Yes No

  11. Which of the following racial categories describes you? You may select more than one.

American Indian or Alaskan Native Asian

Black or African American White

Native Hawaiian or Other Pacific Islander

  12. What category comes closest to your total household income?

    Less than $10,000

    $60,000 to $79,999

    $10,000 to $19,999

    $80,000 to $99,999

    $20,000 to $39,999

    $100,000 to $249,999

    $40,000 to $59,999

    $250,000 or more

  13. If you have any comments on this survey, please write them below:



Attachment 2: Full Text of National Stated Preference Survey Component

OMB Control No. 2040-XXXX

Approval expires XX/XX/XX





Fish and Aquatic Habitat

A Survey of US Households

The public reporting and recordkeeping burden for this collection of information is estimated to average 30 minutes per response. Send comments on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden, including through the use of automated collection techniques, to the Director, Collection Strategies Division, U.S. Environmental Protection Agency (2822T), 1200 Pennsylvania Ave., NW, Washington, D.C. 20460. Include the OMB control number in any correspondence. Do not send the completed survey to this address.



Human Activities, Aquatic Habitat and Fish

This survey asks for your opinions regarding policies that would affect fish and habitat in the U.S. Your answers will help the government decide which policies will be enacted. Background information in this survey was provided by the National Marine Fisheries Service, U.S. Environmental Protection Agency, U.S. Geological Survey and other state and federal offices.


U.S. fresh and salt waters support billions of fish. These include fish that are used by humans, as well as forage fish that are not used by humans, but serve as food for larger fish, birds, and animals.

Food Web: micro-organisms, fish, birds/mammals



This survey concerns proposed policies that would reduce fish losses caused by cooling water use by industrial facilities, including factories and power plants. These policies would benefit aquatic ecosystems but would increase the costs of some goods and services you buy, including electricity and common household products.

1



How Does Cooling Water Affect Fish?

The water that industrial facilities use to cool equipment is pumped from bays, rivers, and lakes. The largest amount is used by power plants that produce electricity.


Cooling Water Intake Screen


How Fish Are Affected by Water Intake


Cooling water use affects fresh and salt waters throughout the US (63% of all fish losses are in salt water coastal bays, estuaries, and tidal rivers and 37% in fresh water).



2



What Kinds of Fish Are Affected?



Cooling water use is not the largest cause of fish loss in most areas (fishing causes greater losses), but it has affected some fish populations.



About 1/3 of the fish lost are species caught by commercial and recreational fishermen. Examples include striped bass, flounder, and cod.



The other 2/3 of the fish lost are forage species that are not caught by humans but serve as food for larger fish, birds, and animals. Examples include killifish, silverside, and stickleback.


Question 1. When thinking about how industrial facilities use cooling water, please rate the importance of the following to you. Check one box for each.



Not Important



Somewhat Important




Very Important







  1. Preventing the loss of fish that are caught by humans

1

2

3

4

5

  1. Preventing the loss of fish that are not caught by humans

1

2

3

4

5

  1. Maintaining the ecological health of rivers, lakes and bays

1

2

3

4

5

  1. Keeping the cost of goods and services low

1

2

3

4

5

  1. Making sure there is enough government regulation of industry

1

2

3

4

5

  1. Making sure there is not too much government regulation of industry

1

2

3

4

5



3



How Many Fish Are Affected?




After accounting for the number of eggs and larvae that would be expected to survive to adulthood, scientists estimate that the equivalent of about 2.5 billion young adult fish (the equivalent of one year old) are lost each year in U.S. coastal and fresh waters due to cooling water use.


Scientists can predict the number of these fish that will be saved under different policies. This number ranges from 0.6 to 2.4 billion fish saved per year.


For commercial fish species, losses of young fish in cooling water intakes vary by species, from the equivalent of less than 0.1% to about 10% of total populations.


Scientists expect the yearly effects on other fish species are in the same 0.1% to 10% range. The number of young fish lost in cooling water intakes relative to the total number of fish in the water is relatively high for some species, but low for others.


Largest Commercial Fish Losses per Year


Although scientists can predict the number of fish saved each year, the effect on fish populations is uncertain. This is because scientists do not know the total number of all fish in U.S. waters and because many factors – such as cooling water use, fishing, pollution and water temperature – affect fish.


The following page provides information on policies that would be required to reduce these fish losses.

Smaller effect on Striped Bass

Larger effect on Winter Flounder

4



New Regulations Are Being Proposed to Protect Fish







Advanced filters and closed cycle cooling are already in use at many facilities and are proven technologies. New regulations would require a mix of advanced filters and closed cycle cooling at all facilities—with reductions in fish losses between 25% and 95%.


5



How important are these issues to you?



While these policies would reduce fish losses, they would also increase the costs of producing many goods and services — these costs would be passed on to consumers like you.






Question 2. Compared to other issues that the government might address—such as public safety, education and health—how important is protecting aquatic ecosystems to you? Check one box.



Not Important



Somewhat Important




Very Important







Protecting aquatic ecosystems is

1

2

3

4

5







The government needs to know whether households are willing to pay the costs of these new policies.


This survey will ask you to compare policies with different effects on cooling water use, fish, and costs to your household. You will be asked to vote for the options you prefer.


You will also have the opportunity to support the current situation, with no new policies, and no new costs to your household.


6


This survey is similar to a public vote



7

The next part of this survey will ask you to consider different types of policies to protect fish, and indicate how you would vote. Effects of each possible policy will be described using the following scores:


Effect of Policy


What It Means

Commercial Fish Populations

(Fish Used by People)


A score between 0 and 100 percent showing the overall health of commercial and recreational fish populations. Higher scores mean more fish and greater fishing potential. A score of 100 means that these fish populations are at a size that maximizes long-term harvest; 0 means no harvest. The current score in U.S. waters is 51.


Fish Populations

(All Fish)


A score between 0 and 100 percent showing the estimated size of all fish populations compared to natural levels without human influence. A score of 100 means that populations are the largest natural size possible; 0 means no fish. The current score in U.S. waters is 30.

Fish Saved

(per Year)


A score between 0 and 100 percent showing the reduction in young fish lost compared to current levels. A score of 100 would mean that no fish are lost in cooling water intakes (all fish would be saved because of the new policy). The current score in U.S. waters is 0. This represents the status quo (no policy) with about 18% of plants already using advanced cooling systems.

Condition of

Aquatic Ecosystems


A score between 0 and 100 percent showing the ecological condition of affected areas, compared to the most natural waters in the U.S. The score is determined by many factors, including water quality and temperature, the health of aquatic species, and habitat conditions. Higher scores mean the area is more natural. The current score in U.S. waters is 53.

$

Cost per Year


How much the policy will cost your household, in unavoidable price increases for products and services you buy, including electricity and common household products.



How Would You Rate the Importance of These Effects?

Question 3. When considering policies that affect how facilities use cooling water, how important to you are effects on each of the following scores? Check one box for each. (For reminders of what the scores mean, please see page 7).



Not Important



Somewhat Important




Very Important







  1. Effect on commercial fish populations

1

2

3

4

5

  1. Effect on fish populations (all fish)

1

2

3

4

5

  1. Effect on fish saved

1

2

3

4

5

  1. Effect on the condition of aquatic ecosystems

1

2

3

4

5

  1. Effect on cost to my household

1

2

3

4

5





The next questions will ask you to choose between different policy options that would affect fish losses in cooling water systems. You will be given choices and asked to vote for the choice you prefer by checking the appropriate box. Questions will look similar to the sample on the next page.

8



SAMPLE QUESTION
Questions will look like the sample below.


Policy Effect


Current Situation
(No policy)

Option A

Option B

Commercial Fish Populations

(in 3-5 Years)


51%

(100% is populations that allow for maximum harvest)

54%

(100% is populations that allow for maximum harvest)

57%

(100% is populations that allow for maximum harvest)

Fish Populations

(all fish)

(in 3-5 Years)


30%

(100% is populations without human influence)

34%

(100% is populations without human influence)

31%

(100% is populations without human influence)

Fish Saved per Year

(Out of 2.5 billion fish lost in water intakes)


0%

No change in status quo

25%

0.6 billion fish saved

25%

0.6 billion fish saved

Condition of Aquatic Ecosystems

(in 3-5 Years)


53%

(100% is pristine condition)

55%

(100% is pristine condition)

57%

(100% is pristine condition)

$

Increase in Cost of Living for Your Household


$0

No cost increase



$48

per year

($4 per month)


$48

per year

($4 per month)


HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)


X


I would vote for

NO POLICY

X


I would vote for

OPTION A

X


I would vote for

OPTION B

If you prefer Option A, check this box

If you do not want A or B, check this box

If you prefer Option B,
check this box



9



As you vote, please remember


  • The map below shows the facilities and areas that would be affected by the proposed policies.

  • The policy options (A and B) given to you each require a different mix of advanced filters and closed cycle cooling in different areas, so effects on fish are different.

  • You will be shown different questions, with different combinations of technology and different costs.

  • Depending on the policies chosen, costs to your household could range from $0 per year to a maximum of $72 per year (from $0 per month to a maximum of $6 per month).

  • Consider each pair of policy options separately—do not add them up or compare programs from different pages.

  • Scientists expect that effects on the environment and economy not shown explicitly will be small. For example, studies of industry suggest that effects on employment will be close to zero.

  • Your votes are important. Answer all questions as if this were a real, binding vote.






10


Question 4. Assume that Options A and B would require a different mix of filters and closed cycle cooling in different areas. Assume all types of fish are affected. How would you vote?




Policy Effect

US


Current Situation
(No policy)

Option A

US

Option B

US

Commercial Fish Populations

(in 3-5 Years)


51%

(100% is populations that allow for maximum harvest)

54%

(100% is populations that allow for maximum harvest)

57%

(100% is populations that allow for maximum harvest)

Fish Populations

(all fish)

(in 3-5 Years)


30%

(100% is populations without human influence)

34%

(100% is populations without human influence)

31%

(100% is populations without human influence)

Fish Saved per Year

(Out of 2.5 billion fish lost in water intakes)


0%

No change in status quo

25%

0.6 billion fish saved

25%

0.6 billion fish saved

Condition of Aquatic Ecosystems

(in 3-5 Years)


53%

(100% is pristine condition)

55%

(100% is pristine condition)

57%

(100% is pristine condition)

$

Increase in Cost of Living for Your Household


$0

No cost increase



$48

per year

($4 per month)


$48

per year

($4 per month)


HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)


I would vote for

NO POLICY

I would vote for

OPTION A

I would vote for

OPTION B

11



POLICIES COULD REQUIRE DIFFERENT COMBINATIONS OF TECHNOLOGY


Now you will be asked to consider a new set of policy options for U.S. waters. As you vote, please remember—

  • Questions 5 and 6 present new sets of policy options. These options require a different mix of technologies in different areas.

  • Each question is a separate vote. Questions 5 and 6 cannot be directly compared to each other, or to Question 4.

  • Do not add up effects or costs across different questions.

  • Policy costs and effects depend on many factors. Saving more fish does not necessarily mean that all effects will improve.

12


Question 5. Assume that Options A and B would require a different mix of filters and closed cycle cooling in different areas. Assume all types of fish are affected. How would you vote?




Policy Effect

US


Current Situation
(No policy)

Option A

US

Option B

US

Commercial Fish Populations

(in 3-5 Years)


51%

(100% is populations that allow for maximum harvest)

57%

(100% is populations that allow for maximum harvest)

57%

(100% is populations that allow for maximum harvest)

Fish Populations

(all fish)

(in 3-5 Years)


30%

(100% is populations without human influence)

32%

(100% is populations without human influence)

34%

(100% is populations without human influence)

Fish Saved per Year

(Out of 2.5 billion fish lost in water intakes)


0%

No change in status quo

55%

1.4 billion fish saved

95%

2.4 billion fish saved

Condition of Aquatic Ecosystems

(in 3-5 Years)


53%

(100% is pristine condition)

54%

(100% is pristine condition)

55%

(100% is pristine condition)

$

Increase in Cost of Living for Your Household


$0

No cost increase



$72

per year

($6 per month)


$60

per year

($5 per month)


HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)


I would vote for

NO POLICY

I would vote for

OPTION A

I would vote for

OPTION B


13


Question 6. Assume that Options A and B would require a different mix of filters and closed cycle cooling in different areas. Assume all types of fish are affected. How would you vote?




Policy Effect

US


Current Situation
(No policy)

Option A

US

Option B

US

Commercial Fish Populations

(in 3-5 Years)


51%

(100% is populations that allow for maximum harvest)

57%

(100% is populations that allow for maximum harvest)

54%

(100% is populations that allow for maximum harvest)

Fish Populations

(all fish)

(in 3-5 Years)


30%

(100% is populations without human influence)

31%

(100% is populations without human influence)

31%

(100% is populations without human influence)

Fish Saved per Year

(Out of 2.5 billion fish lost in water intakes)


0%

No change in status quo

55%

1.4 billion fish saved

55%

1.4 billion fish saved

Condition of Aquatic Ecosystems

(in 3-5 Years)


53%

(100% is pristine condition)

55%

(100% is pristine condition)

54%

(100% is pristine condition)

$

Increase in Cost of Living for Your Household


$0

No cost increase



$72

per year

($6 per month)


$12

per year

($1 per month)


HOW WOULD YOU VOTE?

(CHOOSE ONE ONLY)


I would vote for

NO POLICY

I would vote for

OPTION A

I would vote for

OPTION B



14



Question 7. If you always voted for NO POLICY in questions 4-6, what was the primary reason? Check one. (Skip this question if you voted for Option A or B in any question above.)

____ The cost to my household was too high

____ Preventing fish losses is not important to me

____ I do not trust the government to fix the problem

____ I would rather spend my money on other things

____ I did not believe the choices were realistic

____ Since the problem was created by private facilities, they should fix it without passing costs on to consumers


Question 8. Indicate how strongly you agree with the following statements about questions 4 - 6 and the information provided. Check one box for each.



Strongly Disagree

Disagree

Neutral

Agree

Strongly Agree

The survey provided enough information for me to make informed choices

1

2

3

4

5

I feel confident about my answers

1

2

3

4

5

Information in the survey was easy for me to understand

1

2

3

4

5

Information in the survey was fair and unbiased

1

2

3

4

5

Questions were easy for me to answer

1

2

3

4

5

I would vote the same way in an actual public vote

1

2

3

4

5

The effect of the proposed policies depends on many factors

1

2

3

4

5

Future ecological conditions are never 100% guaranteed

1

2

3

4

5

15


Question 9. How much did the following factors affect your answers to questions 4 – 6? Check one box for each row.






Effect on my answers to questions 4-6


Very Small Effect


Moderate Effect


Very Large Effect








Wanting to reduce taxes or costs to my household.

1

2

3

4

5

Wanting to prevent the loss of industrial jobs.

1

2

3

4

5

Wanting to preserve fish for commercial fishing.

1

2

3

4

5

Wanting to send a message that all environmental issues are important regardless of cost.

1

2

3

4

5

Wanting to preserve fish for recreation (fishing, etc.).

1

2

3

4

5

Wanting to preserve fish to benefit aquatic ecosystems.

1

2

3

4

5

Wanting to know that fish exist in local lakes, rivers and bays.

1

2

3

4

5

Wanting to pay my fair share for government programs.

1

2

3

4

5

Wanting to sustain the competitiveness of US business

1

2

3

4

5

Wanting to preserve fish as a source of food for people.

1

2

3

4

5

Wanting to preserve fish and ecosystems for future generations.

1

2

3

4

5

16




Question 10. How many days did you participate in the following during the last year? For trips longer than one day, please count each day separately. Check one box for each row.






Number of days you did the activity during the past year


0

1-5

6-10

11-15

16+







Boating / Canoeing / Kayaking

1

2

3

4

5

Swimming / Going to the Beach

1

2

3

4

5

Recreational Fishing (Fresh Water)

1

2

3

4

5

Recreational Fishing (Salt Water)

1

2

3

4

5

Shellfishing / Crabbing

1

2

3

4

5

Scuba Diving / Snorkeling

1

2

3

4

5






Question 11. Do you consume commercially caught fish or seafood? Yes No


Do you consume recreationally caught fish or seafood? Yes No





17


The following questions ensure that all groups are fairly represented. All answers are kept confidential to the extent provided by law.



  1. What is your age? years

  2. What is your gender? Male Female

  3. What is the highest level of education that you have completed?

Less than high school One or more years of college

High school or equivalent Bachelor’s Degree

High school + technical school Graduate Degree

  4. How many people live in your household?

  5. How many of these people are 16 years of age or older? ____

  6. How many of these people are 6 years of age or younger? ____

  7. What is your zip code?

  8. Are you currently employed? Yes No

  9. Are you currently employed in the commercial fish industry? Yes No

  10. Are you of Hispanic or Latino origin? Yes No

  11. Which of the following racial categories describes you? You may select more than one.

American Indian or Alaskan Native Asian

Black or African American White

Native Hawaiian or Other Pacific Islander

  12. What category comes closest to your total household income?

    Less than $10,000

    $60,000 to $79,999

    $10,000 to $19,999

    $80,000 to $99,999

    $20,000 to $39,999

    $100,000 to $249,999

    $40,000 to $59,999

    $250,000 or more

  13. If you have any comments on this survey, please write them below:


18



Attachment 3: First Federal Register Notice

ENVIRONMENTAL PROTECTION AGENCY


[EPA-HQ-OW-2010-0595; FRL-]


Agency Information Collection Activities; Proposed Collection; Comment Request; Willingness to Pay Survey for §316(b) Existing Facilities Cooling Water Intake Structures (New), EPA ICR No. 2402.01, OMB Control No. 2040-NEW


AGENCY: Environmental Protection Agency (EPA).


ACTION: Notice.


SUMMARY: In compliance with the Paperwork Reduction Act (PRA) (44 U.S.C. 3501 et seq.), this document announces that EPA is planning to submit a request for a new Information Collection Request (ICR) to the Office of Management and Budget (OMB). Before submitting the ICR to OMB for review and approval, EPA is soliciting comments on specific aspects of the proposed information collection as described below.

DATES: Comments must be submitted on or before [insert date 60 days after publication in the Federal Register].

ADDRESSES: Submit your comments, identified by Docket ID No. EPA-HQ-OW-2010-0595 by one of the following methods:

  • www.regulations.gov: Follow the on-line instructions for submitting comments.

  • Email: [email protected], Attention Docket ID No. EPA-HQ-OW-2010-0595

  • Mail: Water Docket, Environmental Protection Agency, Mailcode: 28221T, 1200 Pennsylvania Ave., NW., Washington, DC 20460, Attention Docket ID No. EPA-HQ-OW-2010-0595. Please include a total of 3 copies.

  • Hand Delivery: Water Docket, EPA Docket Center, EPA West, Room 3334, 1301 Constitution Ave., NW., Washington, DC, Attention Docket ID No. EPA-HQ-OW-2010-0595. Such deliveries are only accepted during the Docket’s normal hours of operation and special arrangements should be made.

Instructions: Direct your comments to Docket ID No. EPA-HQ-OW-2010-0595. EPA's policy is that all comments received will be included in the public docket without change and may be made available online at www.regulations.gov, including any personal information provided, unless the comment includes information claimed to be Confidential Business Information (CBI) or other information whose disclosure is restricted by statute. Do not submit information that you consider to be CBI or otherwise protected through www.regulations.gov or e-mail. The www.regulations.gov website is an “anonymous access” system, which means EPA will not know your identity or contact information unless you provide it in the body of your comment. If you send an e-mail comment directly to EPA without going through www.regulations.gov your e-mail address will be automatically captured and included as part of the comment that is placed in the public docket and made available on the Internet. If you submit an electronic comment, EPA recommends that you include your name and other contact information in the body of your comment and with any disk or CD-ROM you submit. If EPA cannot read your comment due to technical difficulties and cannot contact you for clarification, EPA may not be able to consider your comment. Electronic files should avoid the use of special characters, any form of encryption, and be free of any defects or viruses. For additional information about EPA’s public docket visit the EPA Docket Center homepage at http://www.epa.gov/dockets.

FOR FURTHER INFORMATION CONTACT: Erik Helm, Office of Water, Office of Science and Technology, Engineering and Analysis Division, Economic and Environmental Assessment Branch, 4303T, Environmental Protection Agency, 1200 Pennsylvania Ave., NW, Washington, DC 20460; telephone number: 202-566-1049; fax number: 202-566-1053; email address: [email protected].

SUPPLEMENTARY INFORMATION:

How Can I Access the Docket and/or Submit Comments?

EPA has established a public docket for this ICR under Docket ID No. EPA-HQ-OW-2010-0595 which is available for online viewing at www.regulations.gov, or in person viewing at the Water Docket in the EPA Docket Center (EPA/DC), EPA West, Room 3334, 1301 Constitution Ave., NW, Washington, DC. The EPA/DC Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal holidays. The telephone number for the Reading Room is 202-566-1744, and the telephone number for the Water Docket is 202-566-1752.

Use www.regulations.gov to obtain a copy of the draft collection of information, submit or view public comments, access the index listing of the contents of the docket, and to access those documents in the public docket that are available electronically. Once in the system, select “search,” then key in the docket ID number identified in this document.

What Information is EPA Particularly Interested in?

Pursuant to section 3506(c)(2)(A) of the PRA, EPA specifically solicits comments and information to enable it to:

(i) evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the Agency, including whether the information will have practical utility;

(ii) evaluate the accuracy of the Agency's estimate of the burden of the proposed collection of information, including the validity of the methodology and assumptions used;

(iii) enhance the quality, utility, and clarity of the information to be collected; and

(iv) minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses. In particular, EPA is requesting comments from very small businesses (those that employ less than 25) on examples of specific additional efforts that EPA could make to reduce the paperwork burden for very small businesses affected by this collection.

What Should I Consider when I Prepare My Comments for EPA?

You may find the following suggestions helpful for preparing your comments:

1. Explain your views as clearly as possible and provide specific examples.

2. Describe any assumptions that you used.

3. Provide copies of any technical information and/or data you used that support your views.

4. If you estimate potential burden or costs, explain how you arrived at the estimate that you provide.

5. Offer alternative ways to improve the collection activity.

6. Make sure to submit your comments by the deadline identified under DATES.

7. To ensure proper receipt by EPA, be sure to identify the docket ID number assigned to this action in the subject line on the first page of your response. You may also provide the name, date, and Federal Register citation.

What Information Collection Activity or ICR Does this Apply to?

Affected entities: Entities potentially affected by this action are individuals/households.

Title: Willingness to Pay Survey for Section 316(b) Existing Facilities Cooling Water Intake Structures: Instrument, Pre-test, and Implementation (New)

ICR numbers: EPA ICR No. 2402.01, OMB Control No. 2040-NEW.

ICR status: This ICR is for a new information collection activity. An Agency may not conduct or sponsor, and a person is not required to respond to, a collection of information, unless it displays a currently valid OMB control number. The OMB control numbers for EPA's regulations in title 40 of the CFR, after appearing in the Federal Register when approved, are listed in 40 CFR part 9 and are displayed either by publication in the Federal Register or by other appropriate means, such as on the related collection instrument or form, if applicable. The display of OMB control numbers in certain EPA regulations is consolidated in 40 CFR part 9.

Abstract: Section 316(b) of the Clean Water Act (CWA) requires EPA to ensure that the location, design, construction, and capacity of cooling water intake structures (CWIS) reflect the best technology available (BTA) to protect aquatic organisms from being killed or injured by impingement or entrainment. EPA divided this rulemaking into three phases. At issue here are Phases II and III.

The Phase II rule, which covered existing electric generating plants that withdraw at least 50 million gallons a day (MGD) of cooling water, was completed in July 2004. Industry and environmental stakeholders challenged the Phase II regulations, and on judicial review the Second Circuit remanded several key provisions. In July 2007, EPA suspended the Phase II rule. Following additional review in 2009 by the U.S. Supreme Court in Entergy Corp. v. Riverkeeper Inc., which decided that “EPA permissibly relied on cost-benefit analysis in setting the national performance standards … as part of the Phase II regulations,” EPA voluntarily remanded the rule.

In June 2006, EPA promulgated the 316(b) Phase III rule for existing manufacturers, small-flow power plants (facilities that withdraw less than 50 MGD), and new offshore oil and gas facilities. Offshore oil and gas firms and environmental groups petitioned for judicial review in the Fifth Circuit, but that litigation was stayed pending the completion of the Phase II litigation. EPA has asked the Fifth Circuit to remand the existing facilities portion of the Phase III rule so that it can consider what might be appropriate requirements for all existing facilities. While the Fifth Circuit has not yet issued a decision, EPA anticipates combining Phases II and III into one rulemaking covering all existing facilities.

Under Executive Order 12866, EPA is required to estimate the potential benefits and costs to society of proposed rule options. To assess the public policy significance of the ecological gains from the section 316(b) regulation for existing facilities, EPA requests approval from the Office of Management and Budget to conduct a stated preference survey. Data from the stated preference survey will be used to estimate values (willingness to pay, or WTP) derived by households for changes related to the reduction of fish losses at CWIS, and to provide information to assist in the interpretation and validation of survey responses. EPA has designed the survey to provide data to support the following specific objectives: [a] to estimate the total values (use plus non-use) that individuals place on preventing losses of fish and other aquatic organisms caused by 316(b) facilities; [b] to understand how much individuals value preventing fish losses, increasing fish populations, and increasing commercial and recreational catch rates; [c] to understand how such values depend on the current baseline level of fish populations and fish losses, the scope of the change in those measures, and the certainty level of the predictions; and [d] to understand how such values vary with respect to individuals’ economic and demographic characteristics.

The target population for this stated preference survey is all individuals from continental U.S. households who are 18 years of age or older. The population of households will be stratified by the geographic boundaries of five EPA study regions: California, Great Lakes, Inland, Northeast, and Southeast. Survey participants will be recruited through random digit dialing. The intended sample size for the survey is 2,000 households, counting only households that provide completed surveys. This sample size was chosen to provide statistically robust results while minimizing the cost and burden of the survey. In addition, EPA will take steps to both test for and ameliorate survey non-response bias. EPA will follow standard practice in stated preference design, including the extensive use of focus groups and pretesting to develop the survey questionnaires.
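
To illustrate how the screening workload scales with this design, the following back-of-the-envelope sketch (in Python) allocates the 2,000-completion target evenly across the five study regions and backs out the implied number of telephone screenings under an assumed completion rate. The equal allocation and the 24 percent rate are illustrative assumptions rather than figures taken from the survey design; they are chosen only so the totals are roughly consistent with the screening and completion counts reported in the burden statement below.

# Illustrative sizing sketch for the stratified design described above.
# The equal regional allocation and the assumed completion rate are hypothetical.
import math

regions = ["California", "Great Lakes", "Inland", "Northeast", "Southeast"]
target_completes = 2000        # completed questionnaires sought overall
completion_rate = 0.24         # assumed share of screened households returning a questionnaire

per_region = target_completes // len(regions)                       # 400 completes per region
screenings = {r: math.ceil(per_region / completion_rate) for r in regions}

print("Completed questionnaires per region:", per_region)
print("Implied telephone screenings per region:", screenings)
print("Implied total telephone screenings:", sum(screenings.values()))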

The key elicitation questions in each of the five regional surveys ask respondents whether or not they would vote for policies that would increase their cost of living, in exchange for specified multi-attribute changes in [a] impingement and entrainment losses of fish, [b] commercial fish sustainability, [c] long-term fish populations, and [d] condition of aquatic ecosystems. This “choice experiment” or “choice modeling” framework allows respondents to state their preferences by making a voting-type selection between two hypothetical multi-attribute regulatory options (and a third “status quo” choice that rejects both options). These stated preferences with respect to levels of environmental goods and cost to households, when used in conjunction with other information collected in the survey on the respondent’s use of the affected aquatic resources, household income, and other demographics, can be analyzed statistically (using either a fixed or random effects mixed logit framework) to estimate total WTP for the quantified environmental benefits of the 316(b) existing facilities rulemaking. Data analysis and interpretation is grounded in a standard random utility model.
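
To make the link between the voting responses and WTP explicit, a minimal sketch of the underlying random utility formulation follows, assuming a utility index that is linear in the cost attribute and in the four environmental scores; the linear form and the notation are illustrative rather than a commitment to a particular specification.

U_{ij} = \beta_c C_{ij} + \boldsymbol{\beta}' \mathbf{x}_{ij} + \varepsilon_{ij}, \qquad j \in \{\text{status quo}, A, B\}

P_{ij} = \exp\!\left(\beta_c C_{ij} + \boldsymbol{\beta}' \mathbf{x}_{ij}\right) \Big/ \sum_{k \in \{\text{status quo}, A, B\}} \exp\!\left(\beta_c C_{ik} + \boldsymbol{\beta}' \mathbf{x}_{ik}\right)

\text{WTP}(\mathbf{x}_0 \rightarrow \mathbf{x}_1) = -\,\boldsymbol{\beta}'(\mathbf{x}_1 - \mathbf{x}_0) / \beta_c, \qquad \text{marginal WTP for attribute } k = -\,\beta_k / \beta_c

Here C_{ij} is the annual cost to household i under option j, \mathbf{x}_{ij} is the vector of the four environmental scores, and \varepsilon_{ij} is an iid type I extreme value error, which yields the conditional (fixed-coefficient) logit choice probability shown; treating \boldsymbol{\beta} as randomly distributed across households gives the mixed logit variant mentioned above.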

In addition to the total values, the survey will allow the estimation of values associated with specific choice attributes (following standard methods for choice experiments), and will also allow the flexibility to provide some insight into the relative importance of use versus non-use values in the 316(b) context. Analysis also allows estimation of the variation in WTP across different types of households in different areas. As indicated in prior literature, it is virtually impossible to justify, theoretically, the decomposition of empirical total willingness-to-pay estimates into separate use and non-use components. The survey will, however, provide the flexibility to estimate nonuser values, using various nonuser definitions drawn from responses to survey questions. The structure of the choice attribute questions will also allow the analysis to separate value components related to the most common sources of use values: effects on harvested recreational and commercial fish.
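
As a concrete illustration of how attribute-specific values would be recovered under the conditional logit sketch above, the following Python fragment fits the model to simulated three-alternative choices and reports marginal WTP per score point as the negative ratio of each attribute coefficient to the cost coefficient. The simulated data, coefficient values, and variable names are invented for illustration and are not EPA estimates or EPA's estimation code.

# Hypothetical illustration only: a conditional logit fit to simulated
# three-alternative choice data, with marginal WTP computed as -beta_k / beta_cost.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_obs, n_alt, n_attr = 500, 3, 4                     # choices over {status quo, A, B}; 4 environmental scores
X = rng.uniform(0, 100, (n_obs, n_alt, n_attr))      # attribute scores on the 0-100 scales
cost = rng.choice([0.0, 12.0, 36.0, 48.0, 60.0, 72.0], size=(n_obs, n_alt))
cost[:, 0] = 0.0                                     # the status quo option carries no cost
true_beta, true_bc = np.array([0.020, 0.010, 0.005, 0.015]), -0.03
v = X @ true_beta + true_bc * cost                   # systematic utility used to simulate votes
probs = np.exp(v - v.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)
y = np.array([rng.choice(n_alt, p=p) for p in probs])

def neg_loglik(theta):
    beta, bc = theta[:n_attr], theta[n_attr]
    util = X @ beta + bc * cost
    util -= util.max(axis=1, keepdims=True)          # numerical stability
    p = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(n_obs), y]).sum()

fit = minimize(neg_loglik, np.zeros(n_attr + 1), method="BFGS")
beta_hat, bc_hat = fit.x[:n_attr], fit.x[n_attr]
print("Marginal WTP per score point:", -beta_hat / bc_hat)

Standard errors for such WTP ratios are commonly obtained by the delta method or by simulation from the estimated coefficient distribution.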

The various welfare values that can be derived from this stated preference survey (discussed above), along with those estimated apart from the survey effort, will offer insight into the composition of the value people place on the 316(b) environmental impacts. Within a rulemaking, one of the most crucial concerns is avoiding double counting of benefits (or costs). WTP estimates derived from the survey, for example, may overlap, potentially to a substantial extent, with estimates produced by other methods. Therefore, particular care will be given to avoiding any double counting of values that might be derived from alternative valuation methods. In doing so, the Office of Water will rely upon standard theoretical tools for non-market welfare analysis, as presented by authors including Freeman (2003) and Just et al. (2004).

Burden Statement: The annual public reporting and recordkeeping burden for this collection of information is estimated to average 5 minutes per telephone screening participant and 30 minutes per mail survey respondent including the time necessary to complete and mail back the questionnaire. Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, or disclose or provide information to or for a Federal agency. This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information, processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements which have subsequently changed; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information.

The ICR provides a detailed explanation of the Agency’s estimate, which is only briefly summarized here:

Estimated total number of potential respondents: 8,333 for telephone screening and 2,000 for mailed questionnaires.

Frequency of response: one-time response.

Estimated total average number of responses for each respondent: one-time response.

Estimated total burden hours: 1,527 hours.

Estimated total costs: $34,600. EPA estimates that there will be no capital or operating and maintenance cost burden to respondents.

What is the Next Step in the Process for this ICR?

EPA will consider the comments received and amend the ICR as appropriate. The final ICR package will then be submitted to OMB for review and approval pursuant to 5 CFR 1320.12. At that time, EPA will issue another Federal Register notice pursuant to 5 CFR 1320.5(a)(1)(iv) to announce the submission of the ICR to OMB and the opportunity to submit additional comments to OMB. If you have any questions about this ICR or the approval process, please contact the technical person listed under FOR FURTHER INFORMATION CONTACT.


Dated: ________________

__________________________________

Ephraim S. King, Director

Office of Science and Technology




Attachment 4: Second Federal Register Notice



ENVIRONMENTAL PROTECTION AGENCY


[EPA-HQ-OW-2010-0595; FRL - ]


Agency Information Collection Activities; Submission to OMB for Review and Approval; Willingness to Pay Survey for §316(b) Existing Facilities Cooling Water Intake Structures (New), EPA ICR No. 2402.01, OMB Control No. 2040-NEW


AGENCY: Environmental Protection Agency (EPA).


ACTION: Notice.


SUMMARY: In compliance with the Paperwork Reduction Act (PRA) (44 U.S.C. 3501 et seq.), this document announces that an Information Collection Request (ICR) has been forwarded to the Office of Management and Budget (OMB) for review and approval. This is a request for a new collection. The ICR, which is abstracted below, describes the nature of the information collection and its estimated burden and cost.

DATES: Additional comments may be submitted on or before [insert date 30 days after publication in the Federal Register].

ADDRESSES: Submit your comments, identified by Docket ID No. EPA-HQ-OW-2010-0595 to:

(1) EPA by one of the following methods:

  • www.regulations.gov: Follow the on-line instructions for submitting comments.

  • Email: [email protected], Attention Docket ID No. EPA-HQ-OW-2010-0595

  • Mail: Water Docket, Environmental Protection Agency, Mailcode: 28221T, 1200 Pennsylvania Ave., NW., Washington, DC 20460, Attention Docket ID No. EPA-HQ-OW-2010-0595. Please include a total of 3 copies.

  • Hand Delivery: Water Docket, EPA Docket Center, EPA West, Room 3334, 1301 Constitution Ave., NW., Washington, DC, Attention Docket ID No. EPA-HQ-OW-2010-0595. Such deliveries are only accepted during the Docket’s normal hours of operation and special arrangements should be made.

Instructions: Direct your comments to Docket ID No. EPA-HQ-OW-2010-0595. EPA's policy is that all comments received will be included in the public docket without change and may be made available online at www.regulations.gov, including any personal information provided, unless the comment includes information claimed to be Confidential Business Information (CBI) or other information whose disclosure is restricted by statute. Do not submit information that you consider to be CBI or otherwise protected through www.regulations.gov or e-mail. The www.regulations.gov website is an “anonymous access” system, which means EPA will not know your identity or contact information unless you provide it in the body of your comment. If you send an e-mail comment directly to EPA without going through www.regulations.gov your e-mail address will be automatically captured and included as part of the comment that is placed in the public docket and made available on the Internet. If you submit an electronic comment, EPA recommends that you include your name and other contact information in the body of your comment and with any disk or CD-ROM you submit. If EPA cannot read your comment due to technical difficulties and cannot contact you for clarification, EPA may not be able to consider your comment. Electronic files should avoid the use of special characters, any form of encryption, and be free of any defects or viruses. For additional information about EPA’s public docket visit the EPA Docket Center homepage at http://www.epa.gov/dockets.

(2) OMB by mail to: Office of Information and Regulatory Affairs, Office of Management and Budget (OMB), Attention: Desk Officer for EPA, 725 17th Street, NW, Washington, DC 20503.


FOR FURTHER INFORMATION CONTACT: Erik Helm, Office of Water, Office of Science and Technology, Engineering and Analysis Division, Economic and Environmental Assessment Branch, 4303T, Environmental Protection Agency, 1200 Pennsylvania Ave., NW, Washington, DC 20460; telephone number: 202-566-1049; fax number: 202-566-1053; email address: [email protected].

SUPPLEMENTARY INFORMATION: EPA has submitted the following ICR to OMB for review and approval according to the procedures prescribed in 5 CFR 1320.12. On July 21, 2010 (74 FR 42438), EPA sought comments on this ICR pursuant to 5 CFR 1320.8(d). EPA received five comments during the comment period, which are addressed in the ICR. Any additional comments on this ICR should be submitted to EPA and OMB within 30 days of this notice.

EPA has established a public docket for this ICR under Docket ID No. EPA-HQ-OW-2010-0595 which is available for online viewing at www.regulations.gov, or in person viewing at the Water Docket in the EPA Docket Center (EPA/DC), EPA West, Room 3334, 1301 Constitution Ave., NW, Washington, DC. The EPA/DC Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal holidays. The telephone number for the Reading Room is 202-566-1744, and the telephone number for the Water Docket is 202-566-1752.

Use EPA’s electronic docket and comment system at www.regulations.gov, to submit or view public comments, access the index listing of the contents of the docket, and to access those documents in the docket that are available electronically. Once in the system, select “docket search,” then key in the docket ID number identified above. Please note that EPA’s policy is that public comments, whether submitted electronically or in paper, will be made available for public viewing at www.regulations.gov as EPA receives them and without change, unless the comment contains copyrighted material, CBI, or other information whose public disclosure is restricted by statute. For further information about the electronic docket, go to www.regulations.gov.

Title: Willingness to Pay Survey for Section 316(b) Existing Facilities Cooling Water Intake Structures: Instrument, Pre-test, and Implementation (New)

ICR numbers: EPA ICR No. 2402.01, OMB Control No. 2040-NEW.

ICR status: This ICR is for a new information collection activity. An Agency may not conduct or sponsor, and a person is not required to respond to, a collection of information, unless it displays a currently valid OMB control number. The OMB control numbers for EPA's regulations in title 40 of the CFR, after appearing in the Federal Register when approved, are listed in 40 CFR part 9, are displayed either by publication in the Federal Register or by other appropriate means, such as on the related collection instrument or form, if applicable.  The display of OMB control numbers in certain EPA regulations is consolidated in 40 CFR part 9.

Abstract:

Section 316(b) of the Clean Water Act (CWA) requires EPA to ensure that the location, design, construction, and capacity of cooling water intake structures (CWIS) reflect the best technology available (BTA) to protect aquatic organisms from being killed or injured by impingement or entrainment. At issue here is the regulation of existing steam electric and manufacturing facilities.

Under Executive Order 12866, EPA is required to estimate the potential benefits and costs to society of proposed options for significant rules. To assess the importance of the ecological gains from the section 316(b) regulation, EPA requests approval from OMB to conduct a stated preference survey. Data from the stated preference survey will be used to estimate the values (willingness to pay, or WTP) that households derive from changes related to the reduction of fish losses at CWIS, and to provide information to assist in the interpretation and validation of survey responses. EPA has designed the survey to provide data to support the following specific objectives: [a] to estimate the total values that individuals place on preventing losses of fish and other aquatic organisms caused by 316(b) facilities; [b] to understand how much individuals value preventing fish losses, increasing fish populations, and increasing commercial and recreational catch rates; [c] to understand how such values depend on the current baseline level of fish populations and fish losses, the scope of the change in those measures, and the certainty level of the predictions; and [d] to understand how such values vary with respect to individuals’ economic and demographic characteristics.

The target population for this stated preference survey is all individuals from continental U.S. households who are 18 years of age or older. The population of households will be stratified into four study regions: Northeast, Southeast, Inland, and Pacific. In addition, EPA will administer a national version of the survey that does not require stratification. Survey participants will be recruited through random digit dialing. The intended sample size for the mail survey is 2,288 households including only households providing completed surveys. This sample size was chosen to provide statistically robust results while minimizing the cost and burden of the survey. EPA will also take steps to both test for and ameliorate survey non-response bias. EPA has followed standard practice in stated preference design, including the extensive use of focus groups and pretesting to develop survey questionnaires.

The key elicitation questions ask respondents whether they would vote for policies that would increase their cost of living in exchange for specified multi-attribute changes in [a] impingement and entrainment losses of fish, [b] commercial fish populations, [c] long-term populations of all fish, and [d] the condition of aquatic ecosystems. The respondents’ stated preferences with respect to levels of environmental goods and cost to households, when used in conjunction with other information collected in the survey on the use of the affected aquatic resources, household income, and other demographics, can be analyzed statistically (using a mixed logit framework) to estimate total WTP for the quantified environmental benefits of the 316(b) rulemaking. Data analysis and interpretation are grounded in a standard random utility model.

The welfare values that can be derived from this stated preference survey, along with those estimated apart from the survey effort, will offer insight into the composition of the value people place on the 316(b) environmental impacts. WTP estimates derived from the survey may overlap, potentially to a substantial extent, with estimates that can be provided through some other methods. Therefore, particular care will be given to avoiding any possible double counting of values that might be derived from alternative valuation methods.

Burden Statement: The annual public reporting and recordkeeping burden for this collection of information is estimated to average 5 minutes per telephone screening participant and 30 minutes per mail survey respondent including the time necessary to complete and mail back the questionnaire. Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, or disclose or provide information to or for a Federal agency. This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information, processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements which have subsequently changed; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information.

The ICR provides a detailed explanation of the Agency’s estimate, which is only briefly summarized here:

Estimated total number of potential respondents: 9,533 for telephone screening and 2,288 for mailed questionnaires.

Frequency of response: one-time response.

Estimated total average number of responses for each respondent: one-time response.

Estimated total burden hours: 1,938 hours.

Estimated total costs: $39,583. EPA estimates that there will be no capital or operating and maintenance cost burden to respondents.


Dated: ________________

__________________________________

John Moses, Director, Collection Strategies Division.


Attachment 5: Description of Statistical Survey Design


The following represents an anticipated experimental design for survey implementation, along with the associated number of completed surveys that will be required. Part B of this supporting statement provides detail on the sampling design. The proposed design and sampling plan are based on standard design and sampling theory for choice experiments and population surveys, as outlined by Louviere et al. (2000), Kuhfeld (2009), and Dillman (2000). EPA notes that the anticipated experimental design described here is preliminary and may be subject to refinement during design evaluation, for example to account for dominant or dominated pairs, to ensure ecological feasibility, and to remove attribute combinations that do not provide information for estimation.

The purpose of the 316(b) survey is to calculate average per household parameters (e.g., willingness to pay and choice probabilities) within a given survey population. No sub-population estimates are required. The anticipated experimental design for the choice experiment includes two multi-attribute choice options or alternatives, A and B, together with a fixed status quo or “no policy” option. Options A and B are characterized by levels for the following five attributes:

  1. Fish Saved per Year in A and B (x1A; x1B) – 3 possible levels

  2. Commercial Fish Populations in A and B (x2A; x2B) – 3 possible levels

  3. Fish Populations (all fish) in A and B (x3A; x3B) – 3 possible levels

  4. Aquatic Ecosystem Condition in A and B (x4A; x4B) – 3 possible levels

  5. Cost in A and B (x5A; x5B) - 6 possible levels

This implies an experimental design characterized by [3^4 × 6] for each alternative, or [3^8 × 6^2] for alternatives A and B combined.
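
Spelled out, and assuming only the attribute counts listed above (four 3-level attributes plus one 6-level cost attribute per alternative), the candidate design space is:

```latex
% Profiles per alternative: four 3-level attributes and one 6-level cost attribute
3^{4} \times 6 = 81 \times 6 = 486
% Joint A-B combinations before removing duplicates and dominated pairs
3^{8} \times 6^{2} = 6561 \times 36 = 236{,}196
```

The 72 profiles in Table 1 are drawn from this space via the main effects plan described next.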

To construct a preliminary main effects design that is sufficiently flexible to estimate alternative specific main effects and response patterns (i.e., a non-generic design), we begin with the smallest available 100% efficient linear main effects plan for a full 3^8 × 6^2 design. This treats each attribute of each alternative as a separate design element. Elimination of duplicate profiles reduces efficiency to 99.3%. From this candidate design, an initial evaluation was conducted to identify dominant/dominated pairs. These were adjusted to eliminate dominance, most often by swapping one attribute level between Option A and B. These adjustments result in only minimal changes in orthogonality and other design properties. The result is a design with 72 profiles, with attributes labeled following the above notation, and levels indicated by integers 1...N, where N for each attribute is the number of levels identified above:


Table 1: Set of 72 Design Profiles

Version

Choice Question

x1A

x2A

x3A

x4A

x5A

x1B

x2B

x3B

x4B

x5B

1

1

1

2

3

2

4

1

3

1

3

4

1

2

2

3

2

1

6

3

3

3

2

5

1

3

2

3

1

2

6

2

2

1

1

1

2

1

1

3

3

2

4

2

1

2

2

1

2

2

2

2

2

3

2

2

3

1

1

3

2

3

1

1

1

1

3

3

2

1

3

2

3

1

1

3

3

1

4

1

3

1

3

6

3

2

3

2

2

3

5

3

2

2

2

1

3

3

1

1

2

3

1

2

1

3

3

2

4

1

3

2

2

3

6

1

1

3

1

6

4

2

3

3

3

2

3

2

2

2

3

3

4

3

2

2

1

1

5

3

1

1

2

4

5

1

2

3

1

2

2

2

2

3

3

4

5

2

1

2

1

1

3

1

1

2

1

3

5

3

2

1

3

3

1

1

2

2

1

5

6

1

3

1

2

2

3

1

1

3

3

6

6

2

3

3

1

3

5

3

3

2

2

3

6

3

2

1

2

1

4

2

2

1

1

2

7

1

2

1

3

3

4

3

1

3

1

1

7

2

3

3

2

1

2

1

2

1

3

5

7

3

1

2

1

1

3

2

3

2

2

6

8

1

1

2

3

1

6

3

2

3

3

3

8

2

3

1

2

1

5

2

3

2

1

6

8

3

2

2

2

2

5

1

1

2

2

2

9

1

1

3

1

3

6

2

3

2

1

4

9

2

2

1

3

1

2

3

2

3

2

6

9

3

3

1

2

2

1

3

1

1

3

4

10

1

2

3

3

1

5

2

1

3

1

2

10

2

3

2

2

3

4

3

2

3

2

4

10

3

2

1

2

2

6

3

3

2

3

3

11

1

1

2

2

1

1

3

3

3

2

5

11

2

3

3

3

3

5

1

1

1

3

2

11

3

2

1

3

2

3

1

2

2

1

1

12

1

3

3

2

1

3

2

3

3

1

4

12

2

1

1

1

3

5

3

2

1

3

5

12

3

2

1

3

2

6

2

1

1

2

6

13

1

3

3

1

3

2

1

3

3

2

2

13

2

2

2

1

1

1

3

1

2

3

4

13

3

1

2

3

2

5

1

2

3

1

5

14

1

2

2

3

3

3

3

3

2

3

2

14

2

1

3

2

2

5

3

2

1

2

6

14

3

3

3

1

1

4

2

1

3

1

5

15

1

1

1

1

3

3

2

2

3

2

1

15

2

2

2

1

2

4

1

3

1

2

2

15

3

3

2

3

1

6

2

1

2

3

5

16

1

1

1

2

3

4

2

3

2

3

5

16

2

2

3

3

3

4

1

2

2

1

3

16

3

1

2

1

2

6

3

1

3

1

2

17

1

3

2

1

3

4

3

2

2

1

6

17

2

3

2

1

2

2

2

2

3

3

1

17

3

1

3

2

3

1

2

3

1

2

5

18

1

3

1

1

2

4

1

1

2

2

3

18

2

2

2

1

3

1

1

3

3

3

6

18

3

1

1

3

1

2

3

3

1

1

1

19

1

3

1

3

1

1

1

3

3

2

3

19

2

3

2

1

2

5

3

1

1

1

3

19

3

1

3

2

2

6

1

2

2

2

2

20

1

1

1

3

2

5

1

3

3

1

4

20

2

3

2

3

1

1

2

2

2

2

2

20

3

2

3

2

3

3

1

1

1

2

1

21

1

3

3

1

1

6

1

3

1

3

1

21

2

1

2

3

3

2

2

1

1

2

3

21

3

3

1

2

2

2

3

3

2

1

2

22

1

2

1

1

1

5

1

1

2

3

1

22

2

3

1

3

3

6

2

2

1

2

4

22

3

1

3

1

2

1

3

1

3

1

3

23

1

1

1

1

1

2

1

1

2

2

4

23

2

3

3

1

2

1

3

3

3

1

1

23

3

2

3

2

1

4

3

2

3

3

3

24

1

2

2

2

2

5

2

3

3

3

1

24

2

2

1

1

3

6

1

2

1

1

3

24

3

1

3

3

3

2

3

1

2

3

6
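
As a concrete illustration of the dominance screen applied to these paired profiles, the following is a minimal sketch, not the Agency's design software. It assumes that higher levels of the four ecological attributes are preferred and that lower cost levels are preferred; the example pair is Version 1, Choice Question 1 from Table 1.

```python
from itertools import product

# Illustrative dominance screen for paired choice profiles (levels coded 1..N).
# Assumption: for the four ecological attributes, a higher level is better;
# for cost, a lower level is better. A pair is flagged when one option is at
# least as good on every attribute and strictly better on at least one.

ECO_LEVELS = range(1, 4)   # x1..x4: 3 levels each
COST_LEVELS = range(1, 7)  # x5: 6 levels

def dominates(a, b):
    """True if profile a is weakly better than b everywhere and differs somewhere."""
    better_eco = all(ai >= bi for ai, bi in zip(a[:4], b[:4]))
    better_cost = a[4] <= b[4]
    return better_eco and better_cost and a != b

def is_dominant_pair(a, b):
    return dominates(a, b) or dominates(b, a)

profiles = [p + (c,) for p in product(ECO_LEVELS, repeat=4) for c in COST_LEVELS]
print(len(profiles))  # 486 candidate profiles per alternative (3^4 * 6)

# Example: Table 1, Version 1, Choice Question 1
a = (1, 2, 3, 2, 4)   # x1A..x5A
b = (1, 3, 1, 3, 4)   # x1B..x5B
print(is_dominant_pair(a, b))  # False: neither option dominates the other
```

In practice the screen would be run over all 72 paired profiles, and any dominant/dominated pair would be adjusted by swapping an attribute level between Options A and B, as described above.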


Following common examples in the environmental economics literature, we anticipate three choice questions per survey. This allows the 72 profiles to be included (orthogonally blocked) in 24 unique survey booklets, as illustrated in Table 1. The attribute levels applied within surveys are summarized in Table 2. Tables 3 through 7 present the resulting experimental design for each regional survey and the national survey. Monte Carlo experiments indicate that approximately 6 to 12 completed responses are required for each profile in order to achieve large-sample statistical properties for choice experiments (Louviere et al. 2000, p. 104, citing Bunch and Batsell 1989). Following this guidance, the above design will require 24 × 12 = 288 completed surveys, or 12 completed surveys for each unique survey booklet. This will provide a total of 864 profile responses.
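
A minimal sketch of that arithmetic, using only the constants stated above (72 profiles, three choice questions per booklet, twelve completes per booklet):

```python
# Booklet blocking and completed-survey arithmetic from the design description.
profiles = 72                  # paired A/B profiles in Table 1
questions_per_booklet = 3      # choice questions per survey booklet
completes_per_booklet = 12     # completed surveys targeted for each booklet

booklets = profiles // questions_per_booklet            # 24 unique survey versions
completed_surveys = booklets * completes_per_booklet    # 288 completed surveys
profile_responses = completed_surveys * questions_per_booklet  # 864 profile responses

# Because each profile appears in exactly one booklet, 12 completes per booklet
# also means 12 responses per profile, the upper end of the 6-12 guidance.
print(booklets, completed_surveys, profile_responses)   # 24 288 864
```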



Table 2: Attribute Levels Included in Each Survey Version

Survey Version                 Baseline    Max Change Assigned    Attribute Levels

Commercial Fish Populations
  Northeast                    42%         6%                     43%, 45%, 48%
  Southeast                    39%         6%                     40%, 42%, 45%
  Pacific                      56%         6%                     57%, 59%, 62%
  Inland                       39%         6%                     40%, 42%, 45%
  National                     51%         6%                     52%, 54%, 57%

Fish Populations (all fish)
  Northeast                    26%         4%                     27%, 28%, 30%
  Southeast                    24%         4%                     25%, 26%, 28%
  Pacific                      32%         4%                     33%, 34%, 36%
  Inland                       33%         4%                     34%, 35%, 37%
  National                     30%         4%                     31%, 32%, 34%

Fish Saved per Year
  Northeast                    0%          95%                    5%, 50%, 95%
  Southeast                    0%          90%                    25%, 55%, 90%
  Pacific                      0%          95%                    2%, 50%, 95%
  Inland                       0%          95%                    55%, 75%, 95%
  National                     0%          95%                    25%, 55%, 95%

Aquatic Ecosystem Condition
  Northeast                    50%         4%                     51%, 52%, 54%
  Southeast                    68%         4%                     69%, 70%, 72%
  Pacific                      51%         4%                     52%, 53%, 55%
  Inland                       42%         4%                     43%, 44%, 46%
  National                     53%         4%                     54%, 55%, 57%

Household Cost
  Northeast                    $0          $72                    $12, $24, $36, $48, $60, $72
  Southeast                    $0          $72                    $12, $24, $36, $48, $60, $72
  Pacific                      $0          $72                    $12, $24, $36, $48, $60, $72
  Inland                       $0          $72                    $12, $24, $36, $48, $60, $72
  National                     $0          $72                    $12, $24, $36, $48, $60, $72



Table 3: Experimental Design for the Northeast Survey Region

Survey Version

Choice Question

Option A

Option B

Com. Fish.

Fish. Pop

Fish Saved

Aq. Cond.

Cost

Com. Fish.

Fish. Pop

Fish Saved

Aq. Cond.

Cost

1

1

45%

30%

5%

52%

$48

48%

27%

5%

54%

$48

1

2

48%

28%

50%

51%

$72

48%

30%

95%

52%

$60

1

3

48%

27%

50%

52%

$72

45%

27%

50%

51%

$12

2

1

48%

30%

5%

52%

$48

43%

28%

50%

52%

$12

2

2

45%

28%

50%

54%

$24

48%

27%

50%

51%

$36

2

3

43%

27%

5%

51%

$36

45%

27%

95%

54%

$24

3

1

48%

30%

5%

51%

$48

48%

27%

5%

54%

$72

3

2

45%

28%

95%

54%

$60

45%

28%

95%

52%

$12

3

3

43%

28%

5%

54%

$12

43%

30%

50%

54%

$24

4

1

45%

28%

95%

54%

$72

43%

30%

5%

51%

$72

4

2

48%

30%

95%

52%

$36

45%

28%

50%

54%

$36

4

3

45%

27%

50%

51%

$60

43%

27%

95%

52%

$48

5

1

48%

27%

50%

52%

$24

45%

30%

50%

54%

$48

5

2

45%

27%

5%

51%

$36

43%

28%

5%

51%

$36

5

3

43%

30%

50%

54%

$12

45%

28%

5%

51%

$60

6

1

43%

28%

95%

52%

$36

43%

30%

5%

54%

$72

6

2

48%

27%

95%

54%

$60

48%

28%

95%

52%

$36

6

3

43%

28%

50%

51%

$48

45%

27%

50%

51%

$24

7

1

43%

30%

50%

54%

$48

43%

30%

95%

51%

$12

7

2

48%

28%

95%

51%

$24

45%

27%

5%

54%

$60

7

3

45%

27%

5%

51%

$36

48%

28%

50%

52%

$72

8

1

45%

30%

5%

51%

$72

45%

30%

95%

54%

$36

8

2

43%

28%

95%

51%

$60

48%

28%

50%

51%

$72

8

3

45%

28%

50%

52%

$60

43%

28%

5%

52%

$24

9

1

48%

27%

5%

54%

$72

48%

28%

50%

51%

$48

9

2

43%

30%

50%

51%

$24

45%

30%

95%

52%

$72

9

3

43%

28%

95%

52%

$12

43%

27%

95%

54%

$48

10

1

48%

30%

50%

51%

$60

43%

30%

50%

51%

$24

10

2

45%

28%

95%

54%

$48

45%

30%

95%

52%

$48

10

3

43%

28%

50%

52%

$72

48%

28%

95%

54%

$36

11

1

45%

28%

5%

51%

$12

48%

30%

95%

52%

$60

11

2

48%

30%

95%

54%

$60

43%

27%

5%

54%

$24

11

3

43%

30%

50%

52%

$36

45%

28%

5%

51%

$12

12

1

48%

28%

95%

51%

$36

48%

30%

50%

51%

$48

12

2

43%

27%

5%

54%

$60

45%

27%

95%

54%

$60

12

3

43%

30%

50%

52%

$72

43%

27%

50%

52%

$72

13

1

48%

27%

95%

54%

$24

48%

30%

5%

52%

$24

13

2

45%

27%

50%

51%

$12

43%

28%

95%

54%

$48

13

3

45%

30%

5%

52%

$60

45%

30%

5%

51%

$60

14

1

45%

30%

50%

54%

$36

48%

28%

95%

54%

$24

14

2

48%

28%

5%

52%

$60

45%

27%

95%

52%

$72

14

3

48%

27%

95%

51%

$48

43%

30%

50%

51%

$60

15

1

43%

27%

5%

54%

$36

45%

30%

50%

52%

$12

15

2

45%

27%

50%

52%

$48

48%

27%

5%

52%

$24

15

3

45%

30%

95%

51%

$72

43%

28%

50%

54%

$60

16

1

43%

28%

5%

54%

$48

48%

28%

50%

54%

$60

16

2

48%

30%

50%

54%

$48

45%

28%

5%

51%

$36

16

3

45%

27%

5%

52%

$72

43%

30%

95%

51%

$24

17

1

45%

27%

95%

54%

$48

45%

28%

95%

51%

$72

17

2

45%

27%

95%

52%

$24

45%

30%

50%

54%

$12

17

3

48%

28%

5%

54%

$12

48%

27%

50%

52%

$60

18

1

43%

27%

95%

52%

$48

43%

28%

5%

52%

$36

18

2

45%

27%

50%

54%

$12

48%

30%

5%

54%

$72

18

3

43%

30%

5%

51%

$24

48%

27%

95%

51%

$12

19

1

43%

30%

95%

51%

$12

48%

30%

5%

52%

$36

19

2

45%

27%

95%

52%

$60

43%

27%

95%

51%

$36

19

3

48%

28%

5%

52%

$72

45%

28%

5%

52%

$24

20

1

43%

30%

5%

52%

$60

48%

30%

5%

51%

$48

20

2

45%

30%

95%

51%

$12

45%

28%

50%

52%

$24

20

3

48%

28%

50%

54%

$36

43%

27%

5%

52%

$12

21

1

48%

27%

95%

51%

$72

48%

27%

5%

54%

$12

21

2

45%

30%

5%

54%

$24

43%

27%

50%

52%

$36

21

3

43%

28%

95%

52%

$24

48%

28%

95%

51%

$24

22

1

43%

27%

50%

51%

$60

43%

28%

5%

54%

$12

22

2

43%

30%

95%

54%

$72

45%

27%

50%

52%

$48

22

3

48%

27%

5%

52%

$12

43%

30%

95%

51%

$36

23

1

43%

27%

5%

51%

$24

43%

28%

5%

52%

$48

23

2

48%

27%

95%

52%

$12

48%

30%

95%

51%

$12

23

3

48%

28%

50%

51%

$48

45%

30%

95%

54%

$36

24

1

45%

28%

50%

52%

$60

48%

30%

50%

54%

$12

24

2

43%

27%

50%

54%

$72

45%

27%

5%

51%

$36

24

3

48%

30%

5%

54%

$24

43%

28%

95%

54%

$72



Table 4: Experimental Design for the Southeast Survey Region

Survey Version

Choice Question

Option A

Option B

Com. Fish.

Fish. Pop

Fish Saved

Aq. Cond.

Cost

Com. Fish.

Fish. Pop

Fish Saved

Aq. Cond.

Cost

1

1

42%

28%

25%

70%

$48

45%

25%

25%

72%

$48

1

2

45%

26%

55%

69%

$72

45%

28%

90%

70%

$60

1

3

45%

25%

55%

70%

$72

42%

25%

55%

69%

$12

2

1

45%

28%

25%

70%

$48

40%

26%

55%

70%

$12

2

2

42%

26%

55%

72%

$24

45%

25%

55%

69%

$36

2

3

40%

25%

25%

69%

$36

42%

25%

90%

72%

$24

3

1

45%

28%

25%

69%

$48

45%

25%

25%

72%

$72

3

2

42%

26%

90%

72%

$60

42%

26%

90%

70%

$12

3

3

40%

26%

25%

72%

$12

40%

28%

55%

72%

$24

4

1

42%

26%

90%

72%

$72

40%

28%

25%

69%

$72

4

2

45%

28%

90%

70%

$36

42%

26%

55%

72%

$36

4

3

42%

25%

55%

69%

$60

40%

25%

90%

70%

$48

5

1

45%

25%

55%

70%

$24

42%

28%

55%

72%

$48

5

2

42%

25%

25%

69%

$36

40%

26%

25%

69%

$36

5

3

40%

28%

55%

72%

$12

42%

26%

25%

69%

$60

6

1

40%

26%

90%

70%

$36

40%

28%

25%

72%

$72

6

2

45%

25%

90%

72%

$60

45%

26%

90%

70%

$36

6

3

40%

26%

55%

69%

$48

42%

25%

55%

69%

$24

7

1

40%

28%

55%

72%

$48

40%

28%

90%

69%

$12

7

2

45%

26%

90%

69%

$24

42%

25%

25%

72%

$60

7

3

42%

25%

25%

69%

$36

45%

26%

55%

70%

$72

8

1

42%

28%

25%

69%

$72

42%

28%

90%

72%

$36

8

2

40%

26%

90%

69%

$60

45%

26%

55%

69%

$72

8

3

42%

26%

55%

70%

$60

40%

26%

25%

70%

$24

9

1

45%

25%

25%

72%

$72

45%

26%

55%

69%

$48

9

2

40%

28%

55%

69%

$24

42%

28%

90%

70%

$72

9

3

40%

26%

90%

70%

$12

40%

25%

90%

72%

$48

10

1

45%

28%

55%

69%

$60

40%

28%

55%

69%

$24

10

2

42%

26%

90%

72%

$48

42%

28%

90%

70%

$48

10

3

40%

26%

55%

70%

$72

45%

26%

90%

72%

$36

11

1

42%

26%

25%

69%

$12

45%

28%

90%

70%

$60

11

2

45%

28%

90%

72%

$60

40%

25%

25%

72%

$24

11

3

40%

28%

55%

70%

$36

42%

26%

25%

69%

$12

12

1

45%

26%

90%

69%

$36

45%

28%

55%

69%

$48

12

2

40%

25%

25%

72%

$60

42%

25%

90%

72%

$60

12

3

40%

28%

55%

70%

$72

40%

25%

55%

70%

$72

13

1

45%

25%

90%

72%

$24

45%

28%

25%

70%

$24

13

2

42%

25%

55%

69%

$12

40%

26%

90%

72%

$48

13

3

42%

28%

25%

70%

$60

42%

28%

25%

69%

$60

14

1

42%

28%

55%

72%

$36

45%

26%

90%

72%

$24

14

2

45%

26%

25%

70%

$60

42%

25%

90%

70%

$72

14

3

45%

25%

90%

69%

$48

40%

28%

55%

69%

$60

15

1

40%

25%

25%

72%

$36

42%

28%

55%

70%

$12

15

2

42%

25%

55%

70%

$48

45%

25%

25%

70%

$24

15

3

42%

28%

90%

69%

$72

40%

26%

55%

72%

$60

16

1

40%

26%

25%

72%

$48

45%

26%

55%

72%

$60

16

2

45%

28%

55%

72%

$48

42%

26%

25%

69%

$36

16

3

42%

25%

25%

70%

$72

40%

28%

90%

69%

$24

17

1

42%

25%

90%

72%

$48

42%

26%

90%

69%

$72

17

2

42%

25%

90%

70%

$24

42%

28%

55%

72%

$12

17

3

45%

26%

25%

72%

$12

45%

25%

55%

70%

$60

18

1

40%

25%

90%

70%

$48

40%

26%

25%

70%

$36

18

2

42%

25%

55%

72%

$12

45%

28%

25%

72%

$72

18

3

40%

28%

25%

69%

$24

45%

25%

90%

69%

$12

19

1

40%

28%

90%

69%

$12

45%

28%

25%

70%

$36

19

2

42%

25%

90%

70%

$60

40%

25%

90%

69%

$36

19

3

45%

26%

25%

70%

$72

42%

26%

25%

70%

$24

20

1

40%

28%

25%

70%

$60

45%

28%

25%

69%

$48

20

2

42%

28%

90%

69%

$12

42%

26%

55%

70%

$24

20

3

45%

26%

55%

72%

$36

40%

25%

25%

70%

$12

21

1

45%

25%

90%

69%

$72

45%

25%

25%

72%

$12

21

2

42%

28%

25%

72%

$24

40%

25%

55%

70%

$36

21

3

40%

26%

90%

70%

$24

45%

26%

90%

69%

$24

22

1

40%

25%

55%

69%

$60

40%

26%

25%

72%

$12

22

2

40%

28%

90%

72%

$72

42%

25%

55%

70%

$48

22

3

45%

25%

25%

70%

$12

40%

28%

90%

69%

$36

23

1

40%

25%

25%

69%

$24

40%

26%

25%

70%

$48

23

2

45%

25%

90%

70%

$12

45%

28%

90%

69%

$12

23

3

45%

26%

55%

69%

$48

42%

28%

90%

72%

$36

24

1

42%

26%

55%

70%

$60

45%

28%

55%

72%

$12

24

2

40%

25%

55%

72%

$72

42%

25%

25%

69%

$36

24

3

45%

28%

25%

72%

$24

40%

26%

90%

72%

$72



Table 5: Experimental Design for the Pacific Survey Region

Survey Version

Choice Question

Option A

Option B

Com. Fish.

Fish. Pop

Fish Saved

Aq. Cond.

Cost

Com. Fish.

Fish. Pop

Fish Saved

Aq. Cond.

Cost

1

1

59%

36%

2%

53%

$48

62%

33%

2%

55%

$48

1

2

62%

34%

50%

52%

$72

62%

36%

95%

53%

$60

1

3

62%

33%

50%

53%

$72

59%

33%

50%

52%

$12

2

1

62%

36%

2%

53%

$48

57%

34%

50%

53%

$12

2

2

59%

34%

50%

55%

$24

62%

33%

50%

52%

$36

2

3

57%

33%

2%

52%

$36

59%

33%

95%

55%

$24

3

1

62%

36%

2%

52%

$48

62%

33%

2%

55%

$72

3

2

59%

34%

95%

55%

$60

59%

34%

95%

53%

$12

3

3

57%

34%

2%

55%

$12

57%

36%

50%

55%

$24

4

1

59%

34%

95%

55%

$72

57%

36%

2%

52%

$72

4

2

62%

36%

95%

53%

$36

59%

34%

50%

55%

$36

4

3

59%

33%

50%

52%

$60

57%

33%

95%

53%

$48

5

1

62%

33%

50%

53%

$24

59%

36%

50%

55%

$48

5

2

59%

33%

2%

52%

$36

57%

34%

2%

52%

$36

5

3

57%

36%

50%

55%

$12

59%

34%

2%

52%

$60

6

1

57%

34%

95%

53%

$36

57%

36%

2%

55%

$72

6

2

62%

33%

95%

55%

$60

62%

34%

95%

53%

$36

6

3

57%

34%

50%

52%

$48

59%

33%

50%

52%

$24

7

1

57%

36%

50%

55%

$48

57%

36%

95%

52%

$12

7

2

62%

34%

95%

52%

$24

59%

33%

2%

55%

$60

7

3

59%

33%

2%

52%

$36

62%

34%

50%

53%

$72

8

1

59%

36%

2%

52%

$72

59%

36%

95%

55%

$36

8

2

57%

34%

95%

52%

$60

62%

34%

50%

52%

$72

8

3

59%

34%

50%

53%

$60

57%

34%

2%

53%

$24

9

1

62%

33%

2%

55%

$72

62%

34%

50%

52%

$48

9

2

57%

36%

50%

52%

$24

59%

36%

95%

53%

$72

9

3

57%

34%

95%

53%

$12

57%

33%

95%

55%

$48

10

1

62%

36%

50%

52%

$60

57%

36%

50%

52%

$24

10

2

59%

34%

95%

55%

$48

59%

36%

95%

53%

$48

10

3

57%

34%

50%

53%

$72

62%

34%

95%

55%

$36

11

1

59%

34%

2%

52%

$12

62%

36%

95%

53%

$60

11

2

62%

36%

95%

55%

$60

57%

33%

2%

55%

$24

11

3

57%

36%

50%

53%

$36

59%

34%

2%

52%

$12

12

1

62%

34%

95%

52%

$36

62%

36%

50%

52%

$48

12

2

57%

33%

2%

55%

$60

59%

33%

95%

55%

$60

12

3

57%

36%

50%

53%

$72

57%

33%

50%

53%

$72

13

1

62%

33%

95%

55%

$24

62%

36%

2%

53%

$24

13

2

59%

33%

50%

52%

$12

57%

34%

95%

55%

$48

13

3

59%

36%

2%

53%

$60

59%

36%

2%

52%

$60

14

1

59%

36%

50%

55%

$36

62%

34%

95%

55%

$24

14

2

62%

34%

2%

53%

$60

59%

33%

95%

53%

$72

14

3

62%

33%

95%

52%

$48

57%

36%

50%

52%

$60

15

1

57%

33%

2%

55%

$36

59%

36%

50%

53%

$12

15

2

59%

33%

50%

53%

$48

62%

33%

2%

53%

$24

15

3

59%

36%

95%

52%

$72

57%

34%

50%

55%

$60

16

1

57%

34%

2%

55%

$48

62%

34%

50%

55%

$60

16

2

62%

36%

50%

55%

$48

59%

34%

2%

52%

$36

16

3

59%

33%

2%

53%

$72

57%

36%

95%

52%

$24

17

1

59%

33%

95%

55%

$48

59%

34%

95%

52%

$72

17

2

59%

33%

95%

53%

$24

59%

36%

50%

55%

$12

17

3

62%

34%

2%

55%

$12

62%

33%

50%

53%

$60

18

1

57%

33%

95%

53%

$48

57%

34%

2%

53%

$36

18

2

59%

33%

50%

55%

$12

62%

36%

2%

55%

$72

18

3

57%

36%

2%

52%

$24

62%

33%

95%

52%

$12

19

1

57%

36%

95%

52%

$12

62%

36%

2%

53%

$36

19

2

59%

33%

95%

53%

$60

57%

33%

95%

52%

$36

19

3

62%

34%

2%

53%

$72

59%

34%

2%

53%

$24

20

1

57%

36%

2%

53%

$60

62%

36%

2%

52%

$48

20

2

59%

36%

95%

52%

$12

59%

34%

50%

53%

$24

20

3

62%

34%

50%

55%

$36

57%

33%

2%

53%

$12

21

1

62%

33%

95%

52%

$72

62%

33%

2%

55%

$12

21

2

59%

36%

2%

55%

$24

57%

33%

50%

53%

$36

21

3

57%

34%

95%

53%

$24

62%

34%

95%

52%

$24

22

1

57%

33%

50%

52%

$60

57%

34%

2%

55%

$12

22

2

57%

36%

95%

55%

$72

59%

33%

50%

53%

$48

22

3

62%

33%

2%

53%

$12

57%

36%

95%

52%

$36

23

1

57%

33%

2%

52%

$24

57%

34%

2%

53%

$48

23

2

62%

33%

95%

53%

$12

62%

36%

95%

52%

$12

23

3

62%

34%

50%

52%

$48

59%

36%

95%

55%

$36

24

1

59%

34%

50%

53%

$60

62%

36%

50%

55%

$12

24

2

57%

33%

50%

55%

$72

59%

33%

2%

52%

$36

24

3

62%

36%

2%

55%

$24

57%

34%

95%

55%

$72



Table 6: Experimental Design for the Inland Survey Region

Survey Version

Choice Question

Option A

Option B

Com. Fish.

Fish. Pop

Fish Saved

Aq. Cond.

Cost

Com. Fish.

Fish. Pop

Fish Saved

Aq. Cond.

Cost

1

1

42%

37%

55%

44%

$48

45%

34%

55%

46%

$48

1

2

45%

35%

75%

43%

$72

45%

37%

95%

44%

$60

1

3

45%

34%

75%

44%

$72

42%

34%

75%

43%

$12

2

1

45%

37%

55%

44%

$48

40%

35%

75%

44%

$12

2

2

42%

35%

75%

46%

$24

45%

34%

75%

43%

$36

2

3

40%

34%

55%

43%

$36

42%

34%

95%

46%

$24

3

1

45%

37%

55%

43%

$48

45%

34%

55%

46%

$72

3

2

42%

35%

95%

46%

$60

42%

35%

95%

44%

$12

3

3

40%

35%

55%

46%

$12

40%

37%

75%

46%

$24

4

1

42%

35%

95%

46%

$72

40%

37%

55%

43%

$72

4

2

45%

37%

95%

44%

$36

42%

35%

75%

46%

$36

4

3

42%

34%

75%

43%

$60

40%

34%

95%

44%

$48

5

1

45%

34%

75%

44%

$24

42%

37%

75%

46%

$48

5

2

42%

34%

55%

43%

$36

40%

35%

55%

43%

$36

5

3

40%

37%

75%

46%

$12

42%

35%

55%

43%

$60

6

1

40%

35%

95%

44%

$36

40%

37%

55%

46%

$72

6

2

45%

34%

95%

46%

$60

45%

35%

95%

44%

$36

6

3

40%

35%

75%

43%

$48

42%

34%

75%

43%

$24

7

1

40%

37%

75%

46%

$48

40%

37%

95%

43%

$12

7

2

45%

35%

95%

43%

$24

42%

34%

55%

46%

$60

7

3

42%

34%

55%

43%

$36

45%

35%

75%

44%

$72

8

1

42%

37%

55%

43%

$72

42%

37%

95%

46%

$36

8

2

40%

35%

95%

43%

$60

45%

35%

75%

43%

$72

8

3

42%

35%

75%

44%

$60

40%

35%

55%

44%

$24

9

1

45%

34%

55%

46%

$72

45%

35%

75%

43%

$48

9

2

40%

37%

75%

43%

$24

42%

37%

95%

44%

$72

9

3

40%

35%

95%

44%

$12

40%

34%

95%

46%

$48

10

1

45%

37%

75%

43%

$60

40%

37%

75%

43%

$24

10

2

42%

35%

95%

46%

$48

42%

37%

95%

44%

$48

10

3

40%

35%

75%

44%

$72

45%

35%

95%

46%

$36

11

1

42%

35%

55%

43%

$12

45%

37%

95%

44%

$60

11

2

45%

37%

95%

46%

$60

40%

34%

55%

46%

$24

11

3

40%

37%

75%

44%

$36

42%

35%

55%

43%

$12

12

1

45%

35%

95%

43%

$36

45%

37%

75%

43%

$48

12

2

40%

34%

55%

46%

$60

42%

34%

95%

46%

$60

12

3

40%

37%

75%

44%

$72

40%

34%

75%

44%

$72

13

1

45%

34%

95%

46%

$24

45%

37%

55%

44%

$24

13

2

42%

34%

75%

43%

$12

40%

35%

95%

46%

$48

13

3

42%

37%

55%

44%

$60

42%

37%

55%

43%

$60

14

1

42%

37%

75%

46%

$36

45%

35%

95%

46%

$24

14

2

45%

35%

55%

44%

$60

42%

34%

95%

44%

$72

14

3

45%

34%

95%

43%

$48

40%

37%

75%

43%

$60

15

1

40%

34%

55%

46%

$36

42%

37%

75%

44%

$12

15

2

42%

34%

75%

44%

$48

45%

34%

55%

44%

$24

15

3

42%

37%

95%

43%

$72

40%

35%

75%

46%

$60

16

1

40%

35%

55%

46%

$48

45%

35%

75%

46%

$60

16

2

45%

37%

75%

46%

$48

42%

35%

55%

43%

$36

16

3

42%

34%

55%

44%

$72

40%

37%

95%

43%

$24

17

1

42%

34%

95%

46%

$48

42%

35%

95%

43%

$72

17

2

42%

34%

95%

44%

$24

42%

37%

75%

46%

$12

17

3

45%

35%

55%

46%

$12

45%

34%

75%

44%

$60

18

1

40%

34%

95%

44%

$48

40%

35%

55%

44%

$36

18

2

42%

34%

75%

46%

$12

45%

37%

55%

46%

$72

18

3

40%

37%

55%

43%

$24

45%

34%

95%

43%

$12

19

1

40%

37%

95%

43%

$12

45%

37%

55%

44%

$36

19

2

42%

34%

95%

44%

$60

40%

34%

95%

43%

$36

19

3

45%

35%

55%

44%

$72

42%

35%

55%

44%

$24

20

1

40%

37%

55%

44%

$60

45%

37%

55%

43%

$48

20

2

42%

37%

95%

43%

$12

42%

35%

75%

44%

$24

20

3

45%

35%

75%

46%

$36

40%

34%

55%

44%

$12

21

1

45%

34%

95%

43%

$72

45%

34%

55%

46%

$12

21

2

42%

37%

55%

46%

$24

40%

34%

75%

44%

$36

21

3

40%

35%

95%

44%

$24

45%

35%

95%

43%

$24

22

1

40%

34%

75%

43%

$60

40%

35%

55%

46%

$12

22

2

40%

37%

95%

46%

$72

42%

34%

75%

44%

$48

22

3

45%

34%

55%

44%

$12

40%

37%

95%

43%

$36

23

1

40%

34%

55%

43%

$24

40%

35%

55%

44%

$48

23

2

45%

34%

95%

44%

$12

45%

37%

95%

43%

$12

23

3

45%

35%

75%

43%

$48

42%

37%

95%

46%

$36

24

1

42%

35%

75%

44%

$60

45%

37%

75%

46%

$12

24

2

40%

34%

75%

46%

$72

42%

34%

55%

43%

$36

24

3

45%

37%

55%

46%

$24

40%

35%

95%

46%

$72



Table 7: Experimental Design for the National Survey

Survey Version

Choice Question

Option A

Option B

Com. Fish.

Fish. Pop

Fish Saved

Aq. Cond.

Cost

Com. Fish.

Fish. Pop

Fish Saved

Aq. Cond.

Cost

1

1

54%

34%

25%

55%

$48

57%

31%

25%

57%

$48

1

2

57%

32%

55%

54%

$72

57%

34%

95%

55%

$60

1

3

57%

31%

55%

55%

$72

54%

31%

55%

54%

$12

2

1

57%

34%

25%

55%

$48

52%

32%

55%

55%

$12

2

2

54%

32%

55%

57%

$24

57%

31%

55%

54%

$36

2

3

52%

31%

25%

54%

$36

54%

31%

95%

57%

$24

3

1

57%

34%

25%

54%

$48

57%

31%

25%

57%

$72

3

2

54%

32%

95%

57%

$60

54%

32%

95%

55%

$12

3

3

52%

32%

25%

57%

$12

52%

34%

55%

57%

$24

4

1

54%

32%

95%

57%

$72

52%

34%

25%

54%

$72

4

2

57%

34%

95%

55%

$36

54%

32%

55%

57%

$36

4

3

54%

31%

55%

54%

$60

52%

31%

95%

55%

$48

5

1

57%

31%

55%

55%

$24

54%

34%

55%

57%

$48

5

2

54%

31%

25%

54%

$36

52%

32%

25%

54%

$36

5

3

52%

34%

55%

57%

$12

54%

32%

25%

54%

$60

6

1

52%

32%

95%

55%

$36

52%

34%

25%

57%

$72

6

2

57%

31%

95%

57%

$60

57%

32%

95%

55%

$36

6

3

52%

32%

55%

54%

$48

54%

31%

55%

54%

$24

7

1

52%

34%

55%

57%

$48

52%

34%

95%

54%

$12

7

2

57%

32%

95%

54%

$24

54%

31%

25%

57%

$60

7

3

54%

31%

25%

54%

$36

57%

32%

55%

55%

$72

8

1

54%

34%

25%

54%

$72

54%

34%

95%

57%

$36

8

2

52%

32%

95%

54%

$60

57%

32%

55%

54%

$72

8

3

54%

32%

55%

55%

$60

52%

32%

25%

55%

$24

9

1

57%

31%

25%

57%

$72

57%

32%

55%

54%

$48

9

2

52%

34%

55%

54%

$24

54%

34%

95%

55%

$72

9

3

52%

32%

95%

55%

$12

52%

31%

95%

57%

$48

10

1

57%

34%

55%

54%

$60

52%

34%

55%

54%

$24

10

2

54%

32%

95%

57%

$48

54%

34%

95%

55%

$48

10

3

52%

32%

55%

55%

$72

57%

32%

95%

57%

$36

11

1

54%

32%

25%

54%

$12

57%

34%

95%

55%

$60

11

2

57%

34%

95%

57%

$60

52%

31%

25%

57%

$24

11

3

52%

34%

55%

55%

$36

54%

32%

25%

54%

$12

12

1

57%

32%

95%

54%

$36

57%

34%

55%

54%

$48

12

2

52%

31%

25%

57%

$60

54%

31%

95%

57%

$60

12

3

52%

34%

55%

55%

$72

52%

31%

55%

55%

$72

13

1

57%

31%

95%

57%

$24

57%

34%

25%

55%

$24

13

2

54%

31%

55%

54%

$12

52%

32%

95%

57%

$48

13

3

54%

34%

25%

55%

$60

54%

34%

25%

54%

$60

14

1

54%

34%

55%

57%

$36

57%

32%

95%

57%

$24

14

2

57%

32%

25%

55%

$60

54%

31%

95%

55%

$72

14

3

57%

31%

95%

54%

$48

52%

34%

55%

54%

$60

15

1

52%

31%

25%

57%

$36

54%

34%

55%

55%

$12

15

2

54%

31%

55%

55%

$48

57%

31%

25%

55%

$24

15

3

54%

34%

95%

54%

$72

52%

32%

55%

57%

$60

16

1

52%

32%

25%

57%

$48

57%

32%

55%

57%

$60

16

2

57%

34%

55%

57%

$48

54%

32%

25%

54%

$36

16

3

54%

31%

25%

55%

$72

52%

34%

95%

54%

$24

17

1

54%

31%

95%

57%

$48

54%

32%

95%

54%

$72

17

2

54%

31%

95%

55%

$24

54%

34%

55%

57%

$12

17

3

57%

32%

25%

57%

$12

57%

31%

55%

55%

$60

18

1

52%

31%

95%

55%

$48

52%

32%

25%

55%

$36

18

2

54%

31%

55%

57%

$12

57%

34%

25%

57%

$72

18

3

52%

34%

25%

54%

$24

57%

31%

95%

54%

$12

19

1

52%

34%

95%

54%

$12

57%

34%

25%

55%

$36

19

2

54%

31%

95%

55%

$60

52%

31%

95%

54%

$36

19

3

57%

32%

25%

55%

$72

54%

32%

25%

55%

$24

20

1

52%

34%

25%

55%

$60

57%

34%

25%

54%

$48

20

2

54%

34%

95%

54%

$12

54%

32%

55%

55%

$24

20

3

57%

32%

55%

57%

$36

52%

31%

25%

55%

$12

21

1

57%

31%

95%

54%

$72

57%

31%

25%

57%

$12

21

2

54%

34%

25%

57%

$24

52%

31%

55%

55%

$36

21

3

52%

32%

95%

55%

$24

57%

32%

95%

54%

$24

22

1

52%

31%

55%

54%

$60

52%

32%

25%

57%

$12

22

2

52%

34%

95%

57%

$72

54%

31%

55%

55%

$48

22

3

57%

31%

25%

55%

$12

52%

34%

95%

54%

$36

23

1

52%

31%

25%

54%

$24

52%

32%

25%

55%

$48

23

2

57%

31%

95%

55%

$12

57%

34%

95%

54%

$12

23

3

57%

32%

55%

54%

$48

54%

34%

95%

57%

$36

24

1

54%

32%

55%

55%

$60

57%

34%

55%

57%

$12

24

2

52%

31%

55%

57%

$72

54%

31%

25%

54%

$36

24

3

57%

34%

25%

57%

$24

52%

32%

95%

57%

$72



Sample Sizes for Maximum Acceptable Sampling Error

The goal of the choice experiment is to estimate regression coefficients from mixed or conditional logit models that may be used to estimate willingness to pay for multi-attribute policy alternatives, or the likelihood of choosing a given multi-attribute alternative, following standard random utility modeling procedures (Haab and McConnell 2002). Required sample sizes to estimate population parameters to a given degree of precision are drawn from standard statistical theory, as described by Dillman (2000).
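
To make the estimation step concrete, the following is a minimal sketch of a conditional logit fit and the resulting marginal WTP calculation (the ratio of an attribute coefficient to the negative of the cost coefficient). The simulated data, variable names, and the generic optimizer are illustrative assumptions, not the Agency's estimation code or the actual survey data.

```python
import numpy as np
from scipy.optimize import minimize

# Conditional (McFadden) logit log-likelihood for a choice among 3 alternatives
# (Option A, Option B, status quo), with utility linear in attributes and cost.
# X has shape (n_choice_sets, 3, k); y holds the index of the chosen alternative.
def neg_loglik(beta, X, y):
    v = X @ beta                                # systematic utilities, (n, 3)
    v = v - v.max(axis=1, keepdims=True)        # stabilize the exponentials
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(y)), y].sum()

# Toy data: 500 choice sets, 4 attribute-change columns plus a cost column.
rng = np.random.default_rng(0)
n, k = 500, 5
X = rng.normal(size=(n, 3, k))
X[:, 2, :] = 0.0                                # status quo: no change, no cost
true_beta = np.array([0.8, 0.5, 0.6, 0.4, -0.7])
u = X @ true_beta + rng.gumbel(size=(n, 3))     # random utility with Gumbel errors
y = u.argmax(axis=1)

fit = minimize(neg_loglik, np.zeros(k), args=(X, y), method="BFGS")
beta_hat = fit.x

# Marginal WTP per unit of each attribute, in units of the cost variable,
# is the attribute coefficient divided by the negative of the cost coefficient.
wtp = -beta_hat[:4] / beta_hat[4]
print(np.round(wtp, 2))
```

A mixed logit specification would add random coefficients across respondents, but the WTP calculation from the estimated coefficients follows the same ratio logic.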

The maximum acceptable sampling error for predicting response probabilities (the likelihood of choosing a given alternative) in the present case is ±10%, assuming a true response probability of 50% associated with a utility indifference point. Given the survey population size, this level of precision requires a minimum sample size of approximately 96 observations. The number of observations (completed surveys) required to obtain large-sample properties for the choice experiment design provides more than sufficient observations to obtain this required precision for population parameters. The choice experiment design requires 864 completed profile responses from 288 surveys. Statistical theory shows that 400 assumed-independent profile responses will allow a 50% choice probability to be estimated within ±4.9 percent at a 95% confidence level. With only a single choice question per survey (288 responses), the same probability and confidence level can be obtained within approximately ±8.2 percent; this is also within acceptable limits.
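
A minimal sketch of the underlying calculation, using only the figures stated above (a 50% response probability, 95% confidence, and the ±10% and ±4.9% margins); the helper names are illustrative:

```python
import math

# Sample size and margin-of-error arithmetic for a proportion at 95% confidence
# (z = 1.96), ignoring any finite-population correction.
def required_n(margin, p=0.5, z=1.96):
    """Observations needed to estimate proportion p within +/- margin."""
    return p * (1 - p) * (z / margin) ** 2

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval for proportion p with n observations."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(required_n(0.10)))          # ~96 observations for a +/-10% margin
print(round(margin_of_error(400), 3))   # 0.049, i.e. +/-4.9% with 400 responses
```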



Attachment 6: Preview Letter to Mail Survey Recipients





«date»

«given_name» «surname»

«address»

«city», «state» «zip_code»


Dear «title» «surname»:


I am writing this letter to let you know about an important survey regarding environmental protection and government regulations in the Northeast U.S. Over time, human activities have caused many changes in the Northeast’s rivers, streams, and bays. The Environmental Protection Agency is considering policies that could impact the quality of fish and aquatic habitat in these areas. These policies can have different effects and costs. Because of this, it is important to know what types of policies are supported by Northeast residents.


Through a random process, your household was selected to receive a survey about some of these policies. This survey, Fish and Aquatic Habitat — A Survey of Northeast Residents, will help officials from the Environmental Protection Agency (EPA) to better understand the value of policies which would affect the future of fish and aquatic habitat in the Northeast. It will arrive in the next two weeks.


We selected your address, not you personally, as part of a scientifically-determined regional sample. Your participation is voluntary, and there are no penalties for not answering any questions. Your help, however, is very important. We cannot send this survey to everyone, so your answers will represent the opinions of many other Northeast residents like you and will provide valuable information that will help improve the regional survey.


Sometime in the next few weeks, you will receive a survey booklet by mail. By filling out this survey when it arrives, you will be participating in an important study that will help officials understand your priorities for the environment and the use of public funds. Your participation is extremely important to ensure that the survey results are complete and accurate. Your answers will be kept confidential to the extent provided by law. Please keep in mind that by quickly returning your complete survey, you will be helping to keep down government costs.


We hope that you find this survey important and interesting and thank you for your assistance in this important project.



Sincerely,



Mary T. Smith, Director

U. S. Environmental Protection Agency

Engineering and Analysis Division


Attachment 7: Cover Letter to Mail Survey Recipients






<date>


<given name> <surname>

<address>

<city>, <state> <zip code>-<zip+4>


Dear <title> <surname>:


Within the last two weeks you received a letter informing you that through a random process, your household was selected to receive a short survey regarding environmental protection and government regulations in the Northeast U.S. Thank you for your participation—the survey booklet is enclosed with this letter. Your answers to the survey will help officials from the Environmental Protection Agency (EPA) to better understand the value of policies which would affect the future of fish and aquatic habitat in the Northeast. By filling out this survey, you will be participating in an important study that will help government officials understand your priorities for the environment and regulations.


Your responses to this survey are extremely important to ensure that the survey results are complete and accurate. Over time, human activities have caused many changes in the Northeast’s rivers, streams, and bays. The Environmental Protection Agency is considering policies that could impact the quality of fish and aquatic habitat in these areas. These policies can have different effects and costs. Because of this, it is important to know what types of policies are supported by Northeast residents.


All answers to the survey are kept confidential to the extent provided by law. Once we have received your survey, we will delete your name from all lists, so that your responses can never be traced back to you. Of course, your participation is voluntary and you may refuse to answer any or all questions.

We hope that you find this survey important and interesting, and thank you for your assistance in this important project. We would greatly appreciate if you could return the survey in the near future.



Sincerely,



Mary T. Smith, Director

U. S. Environmental Protection Agency

Engineering and Analysis Division

Attachment 8: Post Card Reminder to Mail Survey Recipients



FRONT




Abt SRBI

Government Services Division

8403 Colesville Road, Suite 820

Silver Spring, MD, 20910






















BACK



Last week a survey was mailed to you concerning environmental protection and government regulations in the Northeast U.S. If you have already returned your completed survey, please accept our sincere thanks.


If you have not yet completed your survey, we ask that you please do so today. You are one of a select few who have been chosen to participate—your answers will help us understand your priorities for the environment and regulations in the Northeast U.S.


If you have misplaced your survey, please contact Ryan Stapler at

(617) 520-3524 or [email protected] for a replacement.


Regards,


Mary T. Smith

Environmental Protection Agency



















Attachment 9: Cover Letter to Recipients of the Second Survey Mailing






<date>


<given name> <surname>

<address>

<city>, <state> <zip code>-<zip+4>


Dear <title> <surname>:


Within the last few weeks a survey was sent to you regarding environmental protection and government regulations in the Northeast U.S. Our records indicate that you have not yet returned a completed survey. You are one of a select few who have been chosen to participate – your answers to the survey will help officials from the Environmental Protection Agency (EPA) to better understand the value of policies which would affect the future of fish and aquatic habitat in the Northeast. If you have not yet returned your survey, we ask that you please do so today. Another copy of the survey is enclosed with this letter.


Your responses to this survey are extremely important to ensure that the survey results are complete and accurate. Over time, human activities have caused many changes in the Northeast’s rivers, streams, and bays. The Environmental Protection Agency is considering policies that could impact the quality of fish and aquatic habitat in these areas. These policies can have different effects and costs. Because of this, it is important to know what types of policies are supported by Northeast residents.


All answers to the survey are kept confidential to the extent provided by law. Once we have received your survey, we will delete your name from all lists, so that your responses can never be traced back to you. Of course, your participation is voluntary and you may refuse to answer any or all questions.


We hope that you find this survey important and interesting, and thank you for your assistance in this important project. We would greatly appreciate if you could return the survey in the near future.



Sincerely,



Mary T. Smith, Director

U. S. Environmental Protection Agency

Engineering and Analysis Division


Attachment 10: Reminder Letter to Mail Survey Recipients



<date>


<given name> <surname>

<address>

<city>, <state> <zip code>-<zip+4>


Dear <title> <surname>:


Within the last week a survey was mailed to you concerning environmental protection and government regulations in the Northeast U.S. Through a random process, your address was selected to receive the survey as part of a scientifically-determined regional sample. If you have not yet completed your survey, we ask that you please do so today. You are one of a select few who have been chosen to participate – your answers to the survey will help officials from the Environmental Protection Agency (EPA) to better understand the value of policies which would affect the future of fish and aquatic habitat in the Northeast.


Your responses to this survey are extremely important to ensure that the survey results are complete and accurate. Over time, human activities have caused many changes in the Northeast’s rivers, streams, and bays. The Environmental Protection Agency is considering policies that could impact the quality of fish and aquatic habitat in these areas. These policies can have different effects and costs. Because of this, it is important to know what types of policies are supported by Northeast residents.


All answers to the survey are kept confidential to the extent provided by law. Once we have received your survey, we will delete your name from all lists, so that your responses can never be traced back to you. Of course, your participation is voluntary and you may refuse to answer any or all questions.

We hope that you find the survey important and interesting, and thank you for your assistance in this important project. We would greatly appreciate if you could return the completed survey in the near future. If you have misplaced your survey, please contact Ryan Stapler at (617) 520-3524 or [email protected] for a replacement.




Sincerely,



Mary T. Smith, Director

U. S. Environmental Protection Agency

Engineering and Analysis Division

Attachment 11: Letter to Participants in the Telephone Non-response Survey






<date>


<given name> <surname>

<address>

<city>, <state> <zip code>-<zip+4>


Dear <title> <surname>:


Within the last few weeks you received a survey regarding environmental protection and government regulations in the Northeast U.S. Through a random process, your address was selected to receive the survey as part of a scientifically-determined regional sample. Our records indicate that you did not return the completed survey. While we are no longer asking that you complete the full survey, we will be contacting you by phone to participate in a brief telephone survey that is expected to take less than 5 minutes. Included with this letter is $2 in cash as an unconditional incentive for your participation in the telephone survey.


Your answers to the telephone survey will help officials from the Environmental Protection Agency (EPA) to better understand the value of policies which would affect the future of fish and aquatic habitat in the Northeast. You will be participating in an important study that will help government officials understand your priorities for the environment and regulations. Your responses are extremely important to ensure that the survey results are complete and accurate. All answers to the telephone survey are kept confidential to the extent provided by law. After you participate we will delete your name from all lists, so that your responses can never be traced back to you. Of course, your participation is voluntary and you may refuse to answer any or all questions.


We hope that you will find the telephone survey important and interesting, and thank you for your assistance in this important project.



Sincerely,



Mary T. Smith, Director

U. S. Environmental Protection Agency

Engineering and Analysis Division

Attachment 12: Cover Letter to Recipients of the Priority Mail Non-response Questionnaire






<date>


<given name> <surname>

<address>

<city>, <state> <zip code>-<zip+4>


Dear <title> <surname>:


Within the last few weeks you received a survey regarding environmental protection and government regulations in the Northeast U.S. Through a random process, your address was selected to receive the survey as part of a scientifically-determined regional sample. Our records indicate that you did not return the completed survey. While we are no longer asking that you complete the full survey, a brief questionnaire is enclosed with this letter that is expected to take less than 5 minutes to complete. Also included is $2 in cash as an unconditional incentive for your participation.


Your answers to the questionnaire will help officials from the Environmental Protection Agency (EPA) to better understand the value of policies which would affect the future of fish and aquatic habitat in the Northeast. By filling out this questionnaire, you will be participating in an important study that will help government officials understand your priorities for the environment and regulations. Your responses to this questionnaire are extremely important to ensure that the survey results are complete and accurate. All answers to the survey are kept confidential to the extent provided by law. Once we have received your questionnaire, we will delete your name from all lists, so that your responses can never be traced back to you. Of course, your participation is voluntary and you may refuse to answer any or all questions.


We hope that you find this questionnaire important and interesting, and thank you for your assistance in this important project. We would greatly appreciate if you could return the questionnaire in the near future.



Sincerely,



Mary T. Smith, Director

U. S. Environmental Protection Agency

Engineering and Analysis Division



Attachment 13: Priority Mail Non-response Questionnaire


OMB Control No. 2040-XXXX

Approval expires XX/XX/XX






Fish and Aquatic Habitat

A Short Survey of U.S. Households










The public reporting and recordkeeping burden for this collection of information is estimated to average 5 minutes per response. Send comments on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden, including through the use of automated collection techniques to the Director, Collection Strategies Division, U.S. Environmental Protection Agency (2822T), 1200 Pennsylvania Ave., NW, Washington, D.C. 20460. Include the OMB control number in any correspondence. Do not send the completed survey to this address.

This is part of an important survey of U.S. residents for the Environmental Protection Agency, or EPA. It is a short questionnaire which should take no more than five minutes. This study will help us to better understand the value of environmental protection and public programs. Any answers you provide are kept confidential to the extent provided by law.


  1. What is your age? ________ years

  2. What is your gender? Male Female

  3. What is the highest level of education that you have completed?

Less than high school

High school or equivalent

High school + technical school

One or more years of college

Bachelor’s Degree

Graduate Degree



  4. How many people live in your household? ________

  5. How many of these people are 16 years of age or older? ____

  6. How many of these people are 6 years of age or younger? ____

  7. What is your zip code? ______________

  8. Are you currently employed? Yes No

  9. Are you currently employed in the commercial fish industry? Yes No



  10. Compared to other issues that the government might address – such as public safety, education and health – how important is protecting aquatic ecosystems to you, on a scale of 1 to 5 where 1 is “not important” and 5 is “very important”?

1

2

3

4

5



  11. People have ideas about the extent to which the government should be involved in protecting the environment. On a scale of 1 to 5 where 1 is “not at all involved” and 5 is “highly involved”, how involved do you think the government should be in environmental protection?

1

2

3

4

5





  12. How many days did you participate in the following during the last year? For trips longer than one day, please count each day separately.

Boating / Canoeing / Kayaking          0    1-5    6-10    11-15    16+

Swimming / Going to the Beach          0    1-5    6-10    11-15    16+

Recreational Fishing (Fresh Water)     0    1-5    6-10    11-15    16+

Recreational Fishing (Salt Water)      0    1-5    6-10    11-15    16+

Shellfishing / Crabbing                0    1-5    6-10    11-15    16+

Scuba Diving / Snorkeling              0    1-5    6-10    11-15    16+


  13. Do you consume commercially caught fish or seafood? Yes No

  14. Do you consume recreationally caught fish or seafood? Yes No

  15. Are you of Hispanic or Latino origin? Yes No

  16. Which of the following racial categories describes you? You may select more than one.

    American Indian or Alaskan Native

    Asian

    Black or African American

    White

    Native Hawaiian or Other Pacific Islander


  17. Which would best describe your living situation?

    Rent your home or apartment

    Own your own home

    Live with family or friends and pay part of the rent or mortgage

    Live with family or friends and do not pay rent

    Other (please specify) ____________________________________

  18. What category comes closest to your total household income?

Less than $10,000

$10,000 to $19,999

$20,000 to $39,999

$40,000 to $59,999

$60,000 to $79,999

$80,000 to $99,999

$100,000 to $249,999

$250,000 or more

Thank you for participating in this very important survey!




Attachment 14: Telephone Non-response Screener Script

Fish and Aquatic Habitat Telephone Screener


Hello, this is _________________ calling from Abt Associates. We are conducting an important survey of U.S. residents for the Environmental Protection Agency, or EPA. This is not a sales call.


SL1 [ASK IF SAMPLE=LANDLINE]


Could you please tell me how many people, age 18 and older, live in this household?


IF NEEDED: This study will help us to better understand the value of environmental protection and public programs. Any answers you give are kept strictly confidential.


IF ASKED: This is a short survey which should take no more than five minutes.


0 None THANK AND SCREEN OUT

1 One

__ Number of persons 18+ SKIP TO SL1c

98 Refused (VOL) THANK AND TERMINATE – Soft Refusal

99 Refused (VOL) THANK AND TERMINATE – Hard Refusal


SL1b [ASK IF SAMPLE=LANDLINE]


May I speak with that person?


1 Respondent on line SKIP TO SL2

2 Respondent called to phone SKIP TO SL1d

3 Respondent unavailable SCHEDULE CALLBACK

8 Refused THANK AND TERMINATE – Soft Refusal

9 Refused THANK AND TERMINATE – Hard Refusal


SL1c [ASK IF SAMPLE=LANDLINE]


In order to select just one person to interview, may I please speak to the person in your household, age 18 or older, who (has had the most recent/will have the next) birthday?


1 Respondent on line SKIP TO SL2

2 Respondent called to phone

3 Respondent unavailable SCHEDULE CALLBACK

8 Refused THANK AND TERMINATE – Soft Refusal

9 Refused THANK AND TERMINATE – Hard Refusal


SL1d [ASK IF SAMPLE=LANDLINE]


Hello, this is _________________ calling from Abt Associates. We are conducting an important survey of U.S. residents for the Environmental Protection Agency, or EPA. Could we begin now?


IF ASKED: This is a short survey which should take no more than five minutes.


1 Yes

2 No time SCHEDULE CALLBACK

8 Refused THANK AND TERMINATE – Soft Refusal

9 Refused THANK AND TERMINATE – Hard Refusal


SL2 [ASK IF SAMPLE=LANDLINE]


Do you have a cell phone in addition to the line on which we’re speaking right now?


1 Yes, also have cell phone

2 No, this is only phone SKIP TO SA2

8 (VOL) Don’t know THANK AND END, screen out

9 Refused THANK AND END, soft refusal


SA1


Of all of the phone calls that you or your family receives, are… (READ LIST)


1 all or almost all calls received on cell phones,

2 some received on cell phones and some received on land lines, or

3 very few or none on cell phones.

8 (VOL) Don’t know

9 (VOL) Refused


SA2


Record gender from observation. (ASK ONLY IF NECESSARY)


1 Male

2 Female


Q1


First, compared to other issues that the government might address – such as public safety, education and health – how important is protecting aquatic ecosystems to you, on a scale of 1 to 5, where 1 is “not important” and 5 is “very important”? (MATCHES QUESTION 2 IN THE MAIL SURVEY)


1 2 3 4 5 (VOL) Don’t Know (VOL) Refused


Q2


People have ideas about the extent to which the government should be involved in protecting the environment. On a scale of 1 to 5 where 1 is “not at all involved” and 5 is “highly involved”, how involved do you think the government should be in environmental protection?


1 2 3 4 5 (VOL) Don’t Know (VOL) Refused


Q3a


Could you please tell me if you participated in each of the following activities during the last year? (DO NOT ROTATE) (MATCHES QUESTION 10 IN THE MAIL SURVEY)



1 Yes

2 No

8 (VOL) Don’t Know

9 (VOL) Refused


Boating, canoeing, or kayaking 1 2 8 9

Swimming/going to the beach 1 2 8 9

Fresh Water Recreational fishing 1 2 8 9

Salt Water Recreational fishing 1 2 8 9

Shell fishing or crabbing 1 2 8 9

Scuba diving or snorkeling 1 2 8 9


(IF THE RESPONDENT DID NOT PARTICIPATE IN ANY OF THE ABOVE ACTIVITIES, SKIP TO Q4)


Q3b


How many days did you participate in the following during the last year? For trips longer than one day, please count each day separately. (MATCHES QUESTION 10 IN THE MAIL SURVEY)


Boating, canoeing, or kayaking 1-5 6-10 11-15 16+

Swimming/going to the beach 1-5 6-10 11-15 16+

Fresh Water Recreational fishing 1-5 6-10 11-15 16+

Salt Water Recreational fishing 1-5 6-10 11-15 16+

Shell fishing or crabbing 1-5 6-10 11-15 16+

Scuba diving or snorkeling 1-5 6-10 11-15 16+


Q4

Do you consume commercially caught fish or seafood?


1 Yes

2 No

Q5

Do you consume recreationally caught fish or seafood?


1 Yes

2 No

Now, I have just a few questions for classification purposes.


D1


What is your age?


________ Years Don’t Know Refused


D2


Could you please tell me how many people live in this household?


________ Don’t Know Refused


D3 How many of these people are 6 years of age or younger?


________ Don’t Know Refused


D4


What is the highest level of education that you have completed? Is it… (READ LIST)


1 Less than high school

2 High school or equivalent

3 High school and technical school

4 One or more years of college

5 Bachelor’s degree

6 Graduate degree


D5


Including everyone living in your household, which of the following categories best describes your total household income before taxes? Is it … (READ LIST)


1 $10,000 or less,

2 Between $10,001 and $20,000,

3 Between $20,001 and $35,000,

4 Between $35,001 and $50,000,

5 Between $50,001 and $75,000,

6 Between $75,001 and $100,000, or

7 More than $100,000

8 (VOL) Don’t know

9 (VOL) Refused


D6


Are you of Hispanic or Latino origin?


1 Yes

2 No

8 (VOL) Don’t know

9 (VOL) Refused


D7


Which of the following racial categories describes you? You may select more than one. Would it be… (READ LIST – MULTIPLE RECORD)


1 American Indian or Alaskan Native,

2 Asian,

3 Black or African American,

4 Native Hawaiian or Other Pacific Islander, or

5 White

6 (VOL) Hispanic / Latino

8 (VOL) Other

9 (VOL) Refused


D8


Do you… (READ LIST)


1 Rent your home or apartment

2 Own your own home

3 Live with family or friends and pay part of the rent or mortgage

4 Live with family or friends and do not pay rent

7 (VOL) Other, Specify

8 (VOL) Other

9 (VOL) Refused


D9


What is your zip code? ___________


D10


Are you currently employed?

1 Yes

2 No


D11


Are you currently employed in the commercial fishing industry?


1 Yes SKIP TO CLOSING

2 No


CLOSING:


Thank you very much for your time, and have a great evening/day.


1 The stated preference format was chosen because it is the only method that allows the estimation of non-use values, which are potentially significant in this case.

2 Non-user values are by definition non-use values. Users can also hold non-use values. However, users’ non-use values may differ from non-users’ values due to familiarity with the resource. Thus, non-user values are not necessarily representative of average non-use values.

3 The environmental attributes to be compared against the cost-of-living increases were designed based on the Johnston et al. (2009) Bioindicator-Based Stated Preference Valuation (BSPV) method, which was developed to promote ecological clarity and closer integration of ecological and economic information within SP studies. In contrast to traditional SP valuation, BSPV employs a more structured and formal use of ecological indicators to characterize and communicate welfare-relevant changes. It begins with a formal basis in ecological science and extends to relationships between attributes in respondents’ preference functions and those used to characterize policy outcomes. Specific BSPV guidelines ensure that survey scenarios and resulting welfare estimates are characterized by: (1) a formal basis in established and measurable ecological indicators, (2) a clear structure linking these indicators to attributes influencing individuals’ well-being, (3) consistent and meaningful interpretation of ecological information, and (4) a consequent ability to link welfare measures to measurable and unambiguous policy outcomes. The welfare measures provided by the BSPV method can be unambiguously linked to models and indicators of ecosystem function, are based on measurable ecological outcomes, and are more easily incorporated into benefit-cost analysis (a generic illustration of how such welfare measures are typically recovered from choice responses appears after these footnotes). This methodology was developed in part to address the EPA Science Advisory Board’s call for improved quantitative linkages between ecological services and economic valuation of those services.

4 Actual response rates could vary across study regions.

5 EPA plans to complete focus group testing of the instrument before the second Federal Register notice of this information collection request. The inclusion of all four attributes is an important aspect of focus group testing. If focus groups find this number of attributes cognitively challenging, the number will be reduced; if the randomly selected focus group participants identify only minimal cognitive issues, all four attributes will remain.
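
Illustrative note: the sketch below is not text from the survey instrument and does not state the Agency’s specified estimation approach; it simply illustrates how welfare measures of the kind referenced in footnote 3 are typically recovered from stated preference choice responses, assuming a linear random utility model with a cost attribute. All symbols are generic placeholders.

\[
% Generic linear random-utility specification for respondent i and choice alternative j;
% C_{ij} is the stated cost (e.g., a cost-of-living increase) and X_{ijk} are ecological attributes.
U_{ij} = \beta_c C_{ij} + \sum_{k} \beta_k X_{ijk} + \varepsilon_{ij}
\]

Under this specification, the marginal willingness to pay for a one-unit improvement in ecological attribute \(k\) is the negative ratio of the attribute coefficient to the cost coefficient:

\[
\mathrm{WTP}_k = -\frac{\beta_k}{\beta_c}
\]

In practice, the coefficients are typically estimated from the choice responses with a conditional logit or similar discrete choice model, and per-household WTP values are then aggregated to the relevant population for benefit-cost analysis.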

