
Supporting Statement for:


Willingness To Pay Survey for Santa Cruz River Management Options in Southern Arizona


January 15, 2014


1. Identification of the Information Collection.


1(a) Title of the Information Collection


Willingness To Pay Survey for Santa Cruz River Management Options in Southern Arizona (New), EPA #2484.01, OMB #2080-NEW


1(b) Short Characterization/Abstract


The USEPA Office of Research and Development is investigating how urban households value scenarios of change for perennial reaches of the effluent-dominated Santa Cruz River, Arizona. These values will be explored via a willingness to pay mail survey instrument. The survey considers two effluent-dominated perennial reaches. The “South” reach starts at an outfall in Rio Rico, AZ, and flows northward through Tumacácori National Historical Park. The “North” reach, fed by two outfalls in northwest Tucson, AZ, flows northwest through Marana, AZ. Elsewhere north of where the channel crosses the border with Mexico, about 5 miles east of Nogales, the Santa Cruz River is ephemeral. For each of the South and North reaches, two different scenarios of change are considered. The first is a reduction in flow length and an associated decrease in cottonwood-willow riparian forest, a rare forest type in the region. The second is an increase in water quality to allow full contact recreation, such as submersion, at normal flow levels. The baseline flow length and forest acreages, as well as the forest acreages that would be associated with reduced flow lengths, are derived from natural science information and modeling. A choice experiment framework with statistically designed tradeoff questions is used, in which options to maintain flow length and forest, or to increase effluent water quality, are posed as increases in a yearly household tax. Each choice question allows a zero-cost “opt out” option. The choice experiment is designed to allow isolation of the value of marginal change for each reach. A few additional questions are included to further understand the motivations for respondent choices, as well as their river-related recreation behavior. Several pages of background introduce the issue to respondents. Limited sociodemographic questions are included to compare respondent sociodemographics with those of the target population. Samples of the two most populated metropolitan areas in Arizona, Phoenix and Tucson, will receive the survey. The survey draft is attached as Appendix 1.
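
To make the choice experiment structure concrete, the sketch below enumerates candidate alternatives from example attribute levels and pairs them with a zero-cost opt-out. The attribute names and level values are illustrative placeholders only, not the levels used in the survey instrument or its statistical design.

```python
# Illustration only: enumerate candidate choice-experiment alternatives from
# example attribute levels. Attribute names and level values are hypothetical
# placeholders, not the levels used in the actual survey instrument.
from itertools import product

flow_miles = [6, 12]                      # miles of perennial flow maintained (hypothetical)
forest_acres = [400, 800]                 # acres of cottonwood-willow forest (hypothetical)
water_quality = ["partial contact", "full contact"]
annual_cost = [0, 25, 50, 100]            # yearly household tax increase, dollars (hypothetical)

# Full factorial of candidate alternatives; the questionnaire would use a
# statistically designed fraction of these in its tradeoff questions.
candidates = list(product(flow_miles, forest_acres, water_quality, annual_cost))

# Every choice question also offers a zero-cost "opt out" (no change, no new tax).
opt_out = {"flow_miles": None, "forest_acres": None,
           "water_quality": "partial contact", "annual_cost": 0}

print(len(candidates), "candidate alternatives, plus an opt-out in every question")
```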


2. Need for and Use of the Collection


2(a) Need/Authority for the Collection

Current ORD research revolves around the theme of sustainability (USEPA, 2013a). An overarching goal cited on the USEPA website for sustainability research is:

“EPA Sustainable communities research is providing decision tools and data for communities to make strategic decisions for a prosperous and environmentally sustainable future, and providing the foundation to better understand the balance between the three pillars of sustainability- environment, society and economy” (USEPA, 2013b).

As part of exploring the “balance” of sustainability, this survey research will estimate the monetary value a sample of urban households places on different management options for the Santa Cruz River in southern Arizona. The Santa Cruz watershed is the subject of continuing research collaboration among USGS, ORD, and other partners, with a peer-reviewed research plan published by Norman et al. (2010). ORD is also collaborating with recipients of a recent National Science Foundation grant engaged in Santa Cruz River natural and social science research, led by the University of Arizona (NSF, 2010). The survey will gather a sample of public input on Santa Cruz River management scenarios to complement partnering natural science research.


2(b) Practical Utility/Users of the Data


The primary reason for the proposed survey is exploratory research. The Santa Cruz River offers a case study of a waterway highly impacted by human modifications and partly within an urban area. Despite such impacts, the Santa Cruz River still potentially provides valuable ecological goods, such as rare riparian habitat and recreational opportunities for the regional population. The survey is designed to deliver exploratory research on public input rather than definitively answer a policy question regarding Santa Cruz River management. The research design limits the population to be sampled to two urban populations (Phoenix and Tucson) and limits the choice experiment to just four environmental attributes. Furthermore, some non-response bias is expected in the results, given an anticipated 30% response rate.


3. Non duplication, Consultations, and Other Collection Criteria


3(a) Non duplication


As exploratory research, the survey is designed to investigate issues in the environmental valuation literature. First, the case study features an urban ecosystem, a setting less represented in valuation survey research. Second, the case study investigates values held by samples from two different urban areas for two different ecological resources. Third, the survey will explore values for a recreation-oriented attribute (water contact) alongside a value not dependent on traditional recreation use (wet river ecosystem preservation). The goals of this exploratory survey research are listed below (and reiterated in Part B of the supporting statement):


  • To estimate values for changing the extent of the flow mileage and associated forest vegetation acreage along the effluent-dominated Santa Cruz River.

  • To estimate values for full contact recreation in the effluent-dominated Santa Cruz River, such as submersion, as a change from partial body contact recreation such as wading.

  • To compare estimated values for changing a recreation-oriented attribute with values for changing the extent of the wet river ecosystem.

  • To provide a case study for estimating values for modifying river attributes of a waterway highly impacted by urban processes.

  • To compare estimated values for changing attributes of two different reaches of the Santa Cruz River, the South and the North. The South has more forest acres per mile of river flow, but is further away from the population centers sampled.

  • To compare estimated values between two population centers, the Phoenix metro area (which is relatively far away from the Santa Cruz River), and the Tucson metro area (which is relatively close to the Santa Cruz River).

  • To learn about river-related recreation habits of the sample, and how these habits as well as sociodemographic characteristics influence values for the changes in the attributes.


Willingness to pay survey research does exist for other US rivers, including other southwestern US rivers (e.g., Weber and Stewart, 2009; Berrens et al., 2000). While benefit transfer techniques could be applied to the results of prior studies in an attempt to gain limited insights on the above objectives, there are a number of hurdles to valuation estimates derived by benefit transfer (e.g., Desvousges et al., 1992; Brouwer, 2000). Differences in the ecological good being considered, local availability of substitute resources, and local tastes and preferences are some of the limitations in transferring estimates from one study or group of studies to a new valuation context. EPA ORD does not believe benefit transfer would match the insights gained from an original study. The proposed survey incorporates natural science modeling of the relationship between surface water and riparian forest. The ecological changes respondents would value are tailored to the Santa Cruz River and explicitly defined in the survey, and the survey has been extensively pretested on the regional population to minimize cognitive problems.


EPA has not identified any other studies that have a study design matching or substantially overlapping with the proposed survey. The survey options were specifically designed to encompass changes to both the South and North effluent-dominated perennial reaches. The options were also carefully defined to specify miles of flow, acreage of riparian forest, and safety of water contact for different types of recreation. The language, graphics, and question formats in the survey were carefully pretested. However, it should be noted that there is prior willingness to pay survey research pertaining to the South reach of the Santa Cruz River (Frisvold and Sprouse, 2006). Although full documentation and results of that survey are neither available nor published, further information on that survey was sought before initiating plans for this new collection (personal communication with G. Frisvold, August, 2009). It was determined that the prior survey, even if results become available, would not be sufficient for the Santa Cruz research goals of this study. This survey will ask respondents to choose between numerically defined levels of environmental attributes matching natural science quantitative modeling (alongside varying cost levels). In contrast, the prior survey asked a more general question of willingness to pay “to permanently preserve the Santa Cruz River habitat as it is today”, using photos to describe a baseline perennial stream habitat, and a changed habitat without perennial flow. The specific marginal change to be valued was not defined, limiting the ability to apply results to the gradient of possible management scenarios. Furthermore, a key geographic difference is that the prior survey focused on the South Santa Cruz only, whereas this survey considers management changes to both the South and North Santa Cruz River, and the reaches have significant differences in both vegetation and proximity to urban households to be sampled.


3(b) Public Notice Required Prior to ICR submission to OMB


This is the second of two Federal Register notices. The first public notice period was 60 days and closed on July 8, 2012.


3(c) Consultations


The principal investigator for this effort is Matthew Weber, a postdoctoral researcher at USEPA, ORD, Western Ecology Division, Corvallis, OR. The principal investigator has past direct experience with willingness to pay survey research, having estimated values for management changes to the river and riparian area of the Rio Grande in Albuquerque, New Mexico (Weber and Stewart, 2009). Previously approved OMB surveys were consulted in designing this survey, in particular a NOAA coral reef valuation study (OMB # 0648-0585), a USEPA study on fish and aquatic habitat impacts from cooling water intake structures (OMB # 2020-0283), and a USEPA study on Chesapeake Bay water quality (OMB # 2010-0043). The survey instrument booklet format and several questions were adapted from the USEPA study OMB # 2020-0283. The principal investigator participated in a workshop of stated preference survey practitioners working on federal government projects, convened by NOAA and Stratus Consulting in June 2012 (NOAA and Stratus Consulting, 2012). This workshop was a helpful forum for comparing notes on willingness to pay survey design, with an emphasis on strategies for presenting ecological goods in a way meaningful to the lay public. Several completed or working draft willingness to pay survey instruments were presented for group discussion.


Although the need to clearly define the ecological good to be valued is established advice for stated preference research (Arrow et al., 1993), Boyd and Krupnick (2013) found that practitioners sometimes use only vague language to define water-related ecological commodities.

This survey uses an explicit approach to defining ecological commodities to be valued, following concepts described in Boyd and Banzhaf (2007). Ecological goods to be valued were derived from extensive focus group research during survey development. Throughout the focus group research Dr. Paul L. Ringold, a research ecologist at USEPA ORD, was consulted for his experience identifying publicly valued stream commodities and metrics (Ringold et al., 2009, and Ringold et al., 2013).


It should be noted that although all changes in the survey are theoretically possible, there is no specific policy proposed by any group or interest that this survey attempts to research. In general, the attributes were selected primarily as the subset of river and stream attributes expressed in locally convened focus groups for which plausible scenarios and choice information could be developed for the Santa Cruz River. In terms of actual local policy discussions, possibilities for changing flow and forest appear to have been discussed more than changes that would allow safe full body contact.


This survey includes natural science modeling outcomes which quantify the relationship between riparian forest acreage and surface flow extent in both the South and North Santa Cruz River, two reaches with different hydrogeological conditions. The natural science modeling is based on dissertation research at Arizona State University (White, 2011) and further analysis by Arizona State University Professor Dr. Juliet Stromberg (personal communication, May 2013). Drs. White and Stromberg were consulted during development of the natural science background provided in the survey instrument as well as the ecological changes the survey poses. Dr. Thomas Meixner and Ph.D. student Rewati Niraula, both at the University of Arizona, were consulted on potential surface streamflow extents under different effluent release scenarios for the North and South reaches of the Santa Cruz River. The most recent Arizona Department of Environmental Quality (2010) report summarizing water quality status for the South and North reaches of the river was reviewed. Persons with either research or management interests in the Santa Cruz River (USGS, University of Arizona, Pima County, and Tumacácori National Historical Park) were consulted on the range of likely treated wastewater releases into the South and North reaches of the Santa Cruz River, and on the current state of water quality. Notably, zero effluent releases into the North or South Santa Cruz River are considered extremely unlikely, and accordingly the survey does not consider this possibility. Region 9 of US EPA was also consulted on the project, including Santa Cruz Watershed contact Jared Vollmer. Although numerous consultations were made, these should not be interpreted as any entity endorsing the survey for management purposes. Again, the overall purpose of the survey is exploratory valuation research being conducted by EPA ORD.


Informal courtesy review comments were specifically solicited from three expert reviewers, two with stated preference experience, and one with experience with Santa Cruz River issues. Their comments are included in Appendix 2, along with a discussion of edits made to the survey.


One comment was received during the first public notice period. This is attached in Appendix 3, along with a discussion of edits made to the survey.


After the second Federal Register notice, three additional comments were received. These comments and a detailed response are attached in Appendix 4, along with a description of resulting edits to the survey.


3(d) Effects of Less Frequent Collection


Without this collection, exploratory research regarding willingness to pay for potential Santa Cruz River management scenarios could not be conducted. The management scenarios are specifically designed to relate to prevalent themes of public interest as derived from focus groups. Furthermore, the collection is a rare opportunity to link environmental value research with a highly developed natural science model of surface flow and accompanying forest vegetation available for the Santa Cruz River. The case study concerns a highly impacted urban waterway, a situation for which few environmental valuation references are available. The survey will compare value estimates for two contrasting ecosystem services, for two separate locations, allowing numerous points of comparison of research interest.


3(e) General Guidelines


The survey will not violate any of the general guidelines described in 5 CFR 1320.5 or in EPA’s ICR handbook.


3(f) Confidentiality


All responses to the survey will be kept confidential. The surveys will be processed, including data entry, by the principal investigator; nobody else will have a record of who has responded or the answers of any given respondent. A list of the addresses of the members of the sample who have responded versus those who have not will be maintained in order to more efficiently mail reminders and replacement surveys. This will be a single file, accessible to and updated only by the principal investigator. To protect confidentiality in survey results, each respondent will be identified by a numeric code in that file rather than their name or address. The survey questions do not ask for any personally identifiable information and personally identifiable information will not be entered in the results even if volunteered by the respondent, for example in the comments section. In the cover letter, respondents will be informed that their responses will be kept confidential. After the data collection is complete, the respondent status file will be deleted, and only the numeric code assigned to each respondent will remain. After data entry is complete, the surveys themselves will be destroyed.
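
As a rough illustration only, the sketch below mirrors the confidentiality protocol described above: survey answers are stored under numeric codes, a single file links codes to addresses for mailing management, and that file is deleted once data collection is complete. File names, fields, and the sample addresses are hypothetical placeholders, not the actual tracking system.

```python
# Rough illustration of the confidentiality protocol described above; file names,
# fields, and the sample addresses are hypothetical placeholders.
import csv
import os

sampled_addresses = ["123 Example St, Tucson, AZ", "456 Sample Ave, Phoenix, AZ"]

# A single file, maintained only by the principal investigator, links numeric
# respondent codes to addresses so reminders and replacements can be targeted.
link_file = "respondent_status.csv"
with open(link_file, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["code", "address", "responded"])
    for code, address in enumerate(sampled_addresses, start=1):
        writer.writerow([code, address, "no"])

# Survey answers themselves are entered under the numeric code only; no names,
# addresses, or other personally identifiable information are recorded.

# Once data collection is complete, the linking file is deleted so that only
# the numeric codes remain with the results.
os.remove(link_file)
```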


The USEPA ORD office location (the Western Ecology Division of USEPA) and the USEPA ORD electronic file system used by the principal investigator are highly secure. A keycard possessed only by ORD employees and contractors is necessary to enter the building. The principal investigator is then in a separate keyed office space within the secure building. The computer system where the personal names and addresses associated with respondent numeric codes will be stored during data entry is a secure server requiring the principal investigator’s personal login username and password. At the conclusion of data entry, the file linking personal names and addresses to respondent codes will be destroyed (along with the hard copy survey responses themselves), and only respondent codes will remain.


3(g) Sensitive Questions


In focus groups, two questions were found to be sensitive by some participants: the question of racial category and the question of income category. However, describing the research need for these questions, namely gauging how well different groups are represented in survey results, was accepted by these participants as a worthwhile reason for asking them. The reason for these questions now prominently appears preceding those questions within the survey: "We need the following questions to ensure votes from all groups have been fairly represented in this survey". Confidentiality of responses is then reiterated.


4. The Respondents and the Information Requested


4(a) Respondents/SIC Codes


The target respondents for this survey are household representatives 18 years or older in the two most populated urban areas of Arizona, the Phoenix metro area and the Tucson metro area. A sample of household representatives in each metro area will be contacted by mail following the multiple-contact protocols in Dillman (2000) and Dillman et al. (2009). A response rate of 30% is expected. Multiple mail contacts will be used to increase sample response rates, consisting of a prenotice to all recipients, a main survey mailing, a reminder postcard, and two follow-up mailings. The target responses from the Phoenix and Tucson metro areas are 250 households each, or 500 households total.


4(b) Information Requested


(i) Data items, including record keeping requirements


The current draft survey is attached as Appendix 1 (note that the page numbers are out of sequence in the electronic file; they are ordered so that they will print correctly double-sided). The survey is divided into four main parts. The first part is background for the choice questions. The second part is the choice questions themselves. The third part contains questions designed to understand why respondents answered the choice questions as they did; these include attitudinal questions as well as recreational preference questions. The fourth part is designed to compare major sociodemographic categories of the received sample with the population sampled. There are no record keeping requirements.


(ii) Respondent Activities


The following respondent activities are envisioned. Participants will read the cover letter and survey, respond to the survey questions, and return the survey using a provided postage paid envelope. Focus group and cognitive interview participants typically took no longer than 30 minutes to complete the survey, so 30 minutes per response is the estimated burden for the average respondent.


5. The Information Collected–Agency Activities, Collection Methodology, and Information Management


5(a) Agency Activities


Development of the survey questionnaire through focus group and cognitive interview pretesting occurred under the separate ICR # 2090-0028. Pretest techniques follow standard approaches in the qualitative methods literature (Morgan and Krueger, 1998; Rubin and Rubin, 2005), as well as guidance in the economics literature for the specific purposes of pretesting a willingness to pay survey (Johnston et al., 1995; Kaplowitz et al., 2001; Hoehn et al., 2003).


Under this ICR, agency activities will include:

  • Develop and finalize the choice experiment design

  • Obtain a representative sample mailing list for each of the two target metro area populations, Phoenix and Tucson

  • Print questionnaires

  • Mail prenotices

  • Mail cover letters and questionnaires

  • Mail reminders

  • Mail follow-ups and replacement questionnaires to non-respondents as needed

  • Enter data and perform quality assurance on the data file

  • Analyze survey results, including characterization of non-response and the potential degree of non-response bias

  • Model choice experiment results

  • Report survey results


5(b) Collection Methodology and Management


The proposed survey is a choice experiment questionnaire delivered and returned by mail. Standard multi-contact mail survey methods will be used to increase the response rate (Dillman, 2000; Dillman et al., 2009). The desired number of completed surveys is 250 in each of the Phoenix and Tucson metro areas, with a target response rate of 30% in each area. Thus, approximately 834 households must be successfully contacted in each metro area. The actual mailing list size will be 1,000 households for each metro area to account for ineligible addresses.
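
The contact figures above follow from the target number of completed surveys and the expected response rate; the short sketch below reproduces that arithmetic using the numbers stated in this section.

```python
# Arithmetic behind the mailing figures stated above (numbers from this section).
import math

target_completes_per_metro = 250     # desired completed surveys per metro area
expected_response_rate = 0.30        # expected response rate

contacts_needed = math.ceil(target_completes_per_metro / expected_response_rate)
print(contacts_needed)               # 834 households per metro area

mailing_list_size = 1000             # mailing list size per metro area
ineligible_allowance = mailing_list_size - contacts_needed
print(ineligible_allowance)          # 166 addresses allowed for ineligibility
```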


Data quality will be monitored by checking returned survey responses for consistency, and by assessing any comments made on the survey or returned with the survey that signal strategic responses or respondent confusion. Coded survey data will not include any identifying information of the respondents. Returned survey data will be coded and used as the dataset for multinomial logit regression modeling.
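
The final model specification will depend on the returned data, but as a minimal sketch of the multinomial (conditional) logit approach, the example below simulates choice data and recovers preference coefficients by maximum likelihood. All variable names, attribute counts, and coefficient values are hypothetical placeholders rather than the survey's actual attributes or results.

```python
# Minimal sketch of a conditional (multinomial) logit fit by maximum likelihood.
# The simulated data, attribute count, and coefficient values are hypothetical
# placeholders, not the actual survey dataset or model specification.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

n_tasks, n_alts, n_attrs = 500, 3, 4              # choice tasks, alternatives per task, attributes
X = rng.normal(size=(n_tasks, n_alts, n_attrs))   # attribute levels (e.g., flow, forest, quality, cost)
true_beta = np.array([0.8, 0.5, 0.6, -1.0])       # assumed preference weights (negative on cost)

# Simulate choices: each respondent picks the alternative with the highest utility
utility = X @ true_beta + rng.gumbel(size=(n_tasks, n_alts))
chosen = utility.argmax(axis=1)

def neg_log_likelihood(beta):
    v = X @ beta                                  # systematic utility of each alternative
    v = v - v.max(axis=1, keepdims=True)          # stabilize the exponentials
    log_prob = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(n_tasks), chosen].sum()

fit = minimize(neg_log_likelihood, x0=np.zeros(n_attrs), method="BFGS")
print("estimated coefficients:", fit.x)

# Implied willingness to pay for an attribute is commonly computed as the negative
# of the attribute coefficient divided by the cost coefficient; for attribute 1:
print("implied WTP for attribute 1:", -fit.x[0] / fit.x[3])
```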


5(c) Small Entity Flexibility


This survey will be administered to individuals, not businesses. Thus, no small entities will be affected by this information collection.


5(d) Collection Schedule

A breakdown of the expected collection schedule is as follows:


  • Week 1: Printing surveys

  • Week 2: First contact mailing for pilot survey, notifying recipients that a survey will be mailed in 1-2 weeks

  • Weeks 3 and 4: Pilot survey mailing

  • Weeks 5 and 6: Pilot survey reminder postcard mailing

  • Weeks 7 through 9: Data entry of pilot survey results; making any revisions to the survey indicated by the pilot data, including updating the choice experiment design

  • Week 10: First contact mailing for main survey, notifying recipients that a survey will be mailed in 1-2 weeks

  • Weeks 11 and 12: Main survey mailing

  • Weeks 13 and 14: Main survey reminder postcard mailing

  • Weeks 15 through 18: Main survey additional reminders and replacement surveys as necessary to reach the target response rate

  • Weeks 19 and 20: Data entry


The schedule above is staged such that if response rates are higher or lower than expected, the appropriate number of replacement surveys will be printed and mailed to most efficiently use funds.


6. Estimating The Burden and Cost of the Collection


6(a) Estimating Respondent Burden


For a typical respondent, a conservative estimate of the time to review and respond to the survey questions is 30 minutes. Assuming the target of 500 total respondents is reached, the burden is 250 hours. This would be a one-time expenditure of respondents’ time.


6(b) Estimating Respondent Costs

(i) Estimating Labor Costs


The Bureau of Labor Statistics reports average wage rates for some metropolitan areas, with the most recent data being from May 2012 (Bureau of Labor Statistics, 2012). The average hourly wage for all occupations in the Phoenix metro area was $21.75, or an average cost of $10.88 per participant for the 0.5-hour burden. The average hourly wage for all occupations in the Tucson metro area was $20.45, or an average cost of $10.23 per participant. Assuming 250 participants in each metro area fill out the survey, the total estimated respondent labor cost is $5,275.
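
The respondent labor cost can be reproduced from the reported wage rates and the 0.5-hour burden estimate; a minimal arithmetic check using the figures above is shown below.

```python
# Arithmetic behind the respondent labor cost estimate (figures from this section).
burden_hours = 0.5                        # estimated hours per response

phoenix_wage, tucson_wage = 21.75, 20.45  # May 2012 average hourly wages
respondents_per_metro = 250

phoenix_cost = phoenix_wage * burden_hours * respondents_per_metro   # $2,718.75
tucson_cost = tucson_wage * burden_hours * respondents_per_metro     # $2,556.25
print(round(phoenix_cost + tucson_cost, 2))                          # 5275.0
```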


(ii) Estimating Capital and Operations and Maintenance Costs


There are no anticipated capital, operations, or maintenance costs associated with this collection.


(iii) Capital/Start-up Operating and Maintenance (O&M) Costs


There are no anticipated capital, operations, or maintenance costs associated with this collection.


(iv) Annualizing Capital Costs


There are no anticipated capital, operations, or maintenance costs associated with this collection.


6(c) Estimating Agency Burden and Cost


The various aspects of the survey mailing are assumed to be done by the principal investigator, with an associated hourly wage rate of $32.50. Preparing survey mailings, tracking non-respondents, sending new mailings as needed, and data entry are anticipated to amount to 8 weeks total or 320 hours of work. Agency labor cost would be 320 hours times $32.50 per hour or $10,400.


6(d) Estimating the Respondent Universe and Total Burden and Costs


Assuming 250 participants in each of the Phoenix and Tucson metro areas fill out the survey, the total respondent labor cost will be $5,275.


6(e) Bottom Line Burden Hours and Cost Tables


Table 1: Burden Hours and Cost Table

Item | Quantity | Cost

Public Burden
Time burden: 0.5 hours per respondent | 500 persons | 250 hours; $5,275 labor

Agency Burden
Time burden | Entire project | 320 hrs; $10,400 labor
Mailing list | 1,000 names in Tucson area & 1,000 names in Phoenix area (accounts for % of ineligible addresses) | $800
Prenotice letter paper and printing | 1,700 pieces | $100
Prenotice envelopes | 1,700 pieces | $150
Prenotice postage (bulk mail) | 1,700 pieces | $700
Color surveys paper and printing | 2,000 pieces (includes estimated replacements) | $2,900
Printing return envelopes 10.5” x 7.5” | 2,000 pieces (includes estimated replacements) | $450
Outgoing envelopes 11.5” x 8.75” | 2,000 pieces (includes estimated replacements) | $300
Outgoing survey postage (bulk mail) | 2,000 pieces (includes estimated replacements) | $1,900
Return survey postage (bulk mail) | 500 pieces | $500
Reminder postcard paper & printing | 1,700 pieces | $100

Total | | $23,575


The estimated respondent burden for this study is 250 hours and $5,275. The estimated agency cost for this study is 320 hours and $10,400. Agency costs besides labor hours total $7,900 for the mailing list, paper, printing, and postage.
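
For completeness, the Table 1 bottom line can be reproduced from the individual line items; the short sketch below sums the cost figures listed above.

```python
# Reproducing the Table 1 bottom line from the cost figures listed above.
respondent_labor = 5275.0        # 500 responses x 0.5 hours at local average wages
agency_labor = 320 * 32.50       # 320 hours at $32.50/hour = 10,400

non_labor = [800,    # mailing list
             100,    # prenotice paper and printing
             150,    # prenotice envelopes
             700,    # prenotice postage
             2900,   # color surveys paper and printing
             450,    # return envelopes
             300,    # outgoing envelopes
             1900,   # outgoing survey postage
             500,    # return survey postage
             100]    # reminder postcards

print(sum(non_labor))                                    # 7900
print(respondent_labor + agency_labor + sum(non_labor))  # 23575.0
```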


6(f) Reasons for Change in Burden


The survey is a one-time data collection activity.


6(g) Burden Statement


The annual public reporting and recordkeeping burden for this collection of information is estimated to average 0.5 hours per response. Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, or disclose or provide information to or for a Federal agency. This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information, processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB control numbers for EPA's regulations are listed in 40 CFR part 9 and 48 CFR chapter 15.


To comment on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden, including the use of automated collection techniques, EPA has established a public docket for this ICR under Docket ID Number EPA-HQ-ORD-2013-0282, which is available for online viewing at www.regulations.gov, or in person viewing at the Office of Research & Development (ORD) Docket in the EPA Docket Center (EPA/DC), EPA West, Room 3334, 1301 Constitution Avenue, NW, Washington, D.C.  The EPA Docket Center Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal holidays.  The telephone number for the Reading Room is (202) 566-1744, and the telephone number for the ORD Docket is (202) 566-1752.  An electronic version of the public docket is available at www.regulations.gov.  This site can be used to submit or view public comments, access the index listing of the contents of the public docket, and to access those documents in the public docket that are available electronically.  When in the system, select “search,” then key in the Docket ID Number identified above.  Also, you can send comments to the Office of Information and Regulatory Affairs, Office of Management and Budget, 725 17th Street, NW, Washington, D.C. 20503, Attention: Desk Officer for EPA.  Please include the EPA Docket ID Number EPA-HQ-ORD-2013-0282 and OMB Control Number 2080-NEW in any correspondence.





