Supporting Statement A
Atlantic Offshore Wind Energy Development–Public Attitudes, Values, and Implications for Tourism and Recreation
OMB Control Number 1010-XXXX
Terms of Clearance: None.
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.
The Bureau of Ocean Energy Management (BOEM) is responsible for conducting Outer Continental Shelf (OCS) lease sales and for monitoring and mitigating adverse impacts that might be associated with offshore energy development. The Energy Policy Act of 2005 (42 U.S.C. 13201 et seq.) authorizes the Secretary of the Interior to issue leases, easements, and rights-of-way for offshore renewable energy activities in Federal waters, such as offshore wind power development. BOEM’s Office of Offshore Renewable Energy oversees the leasing and planning process and promulgated the Final Renewable Energy Framework regulations in April 2009 (30 CFR 585). We are required under multiple statutes (the Outer Continental Shelf Lands Act (43 U.S.C. 1331-1356), the National Environmental Policy Act (42 U.S.C. 4321 et seq.), and the National Historic Preservation Act (16 U.S.C. 470 et seq.)) to take into consideration the impacts of OCS activities on recreational and cultural resources.
While there has been significant interest in offshore wind power development in recent years, the absence of baseline data for specific areas along the Atlantic coast and the absence of a broader regional study on tourism and wind power have made it difficult to identify and analyze the potential impacts of offshore wind development on coastal tourism and recreation. Additional information on these potential impacts will contribute to better planning and decision making for BOEM and other stakeholders, including other Federal agencies and State and local governments.
Under a cooperative agreement awarded by the Department of the Interior, the University of Delaware will conduct a survey to assess the impact of offshore wind power projects on coastal recreation and tourism from Massachusetts to South Carolina. The survey will gauge public perceptions of offshore wind energy projects and how development could impact future recreation and visitation choices. This survey and its accompanying analysis will build upon past studies (see response to question 4 below) and allow us and other stakeholders to better understand the impacts of wind facility development. It will also help us in the preparation of environmental compliance documents related to offshore renewable energy development and leasing. Additionally, the survey with its accompanying research could contribute to Atlantic marine spatial planning efforts, provide additional information for our Renewable Energy State Task Forces, and assist coastal planners along the Atlantic seaboard.
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. Be specific. If this collection is a form or a questionnaire, every question needs to be justified.
We will use the information from this survey to gauge public perceptions of offshore wind energy projects and to estimate how development could affect future recreation and visitation choices. We will assess the impact for east coast states from Massachusetts to South Carolina, including the impacts of wind projects at varying distances offshore. Impacts will also be assessed by type of beach (e.g., natural versus developed). The primary impacts of interest are whether the presence of a wind project would make a person's beach experience worse or better and whether it would cause a person not to visit a beach. For those choosing not to visit a beach, we are also interested in what they would do instead (go to another beach, engage in some other form of recreation, and so forth). We will analyze these effects in the context of an economic model that will provide an assessment of the impact on economic welfare.
We will use this information, along with other economic and environmental information, in our offshore wind decision-making process and marine spatial planning efforts. States and coastal communities will use the information for local coastal planning efforts.
The data collection will be done by an internet-based survey and covers beachgoers and non-beachgoers. The survey is in five parts. The first part is an introduction. It begins by screening for individuals who have made a trip to an ocean beach on the east coast in the last 12 months. It then asks about a person's beach activities, which helps define what we mean by beach use and lets us classify people by how they use the beach (activities on the water, activities in the sand, activities in nearby communities, and so forth). We also gather data on whether people own a secondary residence near the beach and whether they have family or friends who own property near the beach. These "beach property" questions are important because access to a cottage or home near the beach plays a large role in the choice of which beach to visit.
In the second part of the survey, we ask people to report all of the beaches they have visited on the east coast in the last 12 months. We then randomly draw one of these beaches and ask a series of detailed questions about that trip: when it occurred, the type of trip, activities while there, and so forth.
The third part of the survey is a simulation of the offshore wind project. Here we ask people to imagine that they are on the beach randomly chosen in the second section and then show them a simulation (a picture with panning) of a wind project offshore. The project shown is randomly drawn from 2.5, 5, 7.5, 10, 12.5, 15, 17.5, and 20 miles offshore. Respondents view the project on a clear day, on a hazy day, and at night. They are also shown simulations without wind projects for comparison. Following the simulation, we ask individuals how the presence of the wind project would affect their beach experience and whether it would have caused them to cancel their trip. If they report that they would have canceled, we ask what they would have done instead. We also ask how the presence of a wind project off a beach they had not visited might have affected their beach going, specifically whether it would have caused them to take an extra trip to see the project. We are interested in changes in visitation patterns, even temporary ones, due to curiosity associated with viewing a new project. We also ask individuals about their willingness to pay for a guided tour of the project, to get an idea of potential demand for this kind of recreational use as well. Finally, we ask individuals for their opinion on the potential effects of offshore wind projects on beach use and tourism generally (not on their own trips, but on everyone's trips).
The fourth section collects data on the number of day, short overnight, and long overnight trips to each of the beaches reported in the second section. These data will be used to estimate the travel cost random utility model. We also ask people to report the mode of transit they most often use to reach the beach and the share of travel expenses they pay, which will be used to estimate travel costs.
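To make the role of these data concrete, a minimal sketch of a travel cost random utility (site-choice) model is given here; the notation is illustrative only, and the exact specification (site attributes, any nesting structure, and the link to trip frequency) will be determined during the analysis. Each person i is assumed to choose among beaches j on a choice occasion according to

\[
U_{ij} = -\lambda \, tc_{ij} + \beta' x_{j} + \delta \, wind_{j} + \varepsilon_{ij},
\]

where tc_{ij} is person i's round-trip travel cost to beach j, x_{j} are beach characteristics (e.g., natural versus developed), wind_{j} indicates a visible wind project at a given distance offshore, and \varepsilon_{ij} is an i.i.d. type I extreme value error (the conditional logit case). Under this specification, the per-occasion welfare effect of introducing a project is the change in expected maximum utility (the log-sum):

\[
\Delta W_{i} = \frac{1}{\lambda} \left[ \ln \sum_{j} e^{V_{ij}^{1}} - \ln \sum_{j} e^{V_{ij}^{0}} \right],
\]

where V_{ij} is the deterministic part of U_{ij} and the superscripts 1 and 0 denote conditions with and without the project.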
The fifth section closes with a series of demographic and attitudinal questions. GfK will provide most of the demographic variables needed, since these are already available for all members of its panel. Here we gather data on use of vehicle access on the beach, the extent to which people view themselves as environmentalists, and some additional income information. Our data on wages, income, and related items are perhaps more detailed than usual because we use this information to estimate individuals' value of time. No personally identifiable information is collected.
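As an illustration of how the transit mode, expense share, and value-of-time information will be used, travel costs in models of this kind are commonly constructed as out-of-pocket cost plus the opportunity cost of travel time, for example

\[
tc_{ij} = s_{i} \, c \, d_{ij} + \theta \, w_{i} \, t_{ij},
\]

where d_{ij} is round-trip distance, c is a per-mile vehicle operating cost, s_{i} is the share of travel expenses the respondent pays, t_{ij} is round-trip travel time, w_{i} is the respondent's implied hourly wage, and \theta is the fraction of the wage used to value travel time (values such as one-third are a common convention in the travel cost literature). The specific values of c and \theta and the functional form are assumptions to be settled at the estimation stage; the expression above is only a sketch.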
The structure of the survey is similar for non-beachgoers (individuals who have not visited a beach in the last 12 months). People who report that they rarely or never visit beaches are asked whether they might take a special trip to see a wind project, but they are not asked about their last beach trip since they presumably did not have one. People who visit beaches but did not go in the last year are asked how they might have reacted to a wind project on their last trip.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden and specifically how this collection meets GPEA requirements.
The survey is internet-based. We chose an internet-based approach in part to improve the images respondents are shown. The internet also allows us to easily accommodate different skip patterns and variation in the wind projects shown to respondents. These are also possible in mail surveys, but the ease and number of variations we can consider are greater in an internet setting.
4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.
There have been a number of studies in the United States on tourism-related impacts of offshore wind power: Lilley et al. (2010) in Delaware, Schulman and Rivera (2009) in New Jersey, and Landry et al. (2012) in North Carolina. Lilley et al. (2010) and Schulman and Rivera (2009) are more attitudinal in nature, do not explicitly consider an economic model of behavior, and cover only a few beaches. Landry et al. (2012) estimate an economic model and provide a good starting point for our analysis; however, they consider only North Carolina beaches, examine projects at only two distances from shore, and use a less sophisticated choice model than we propose. Our analysis will cover a broader region (Massachusetts to South Carolina). We will also use improved images of wind projects and consider how impacts vary by user group and type of beach. Finally, our model captures substitution and behavioral responses to wind projects more fully than that of Landry et al. (2012), which is critical to understanding the impact of wind projects on beach use and tourism.
Simply modifying the existing studies for BOEM's purposes would compromise accuracy and east coast coverage too much for our purposes and could be difficult to defend in final analyses where actual wind energy decisions are being made.
There are also a number of stated choice analyses looking at impacts on coastal residents (see Ladenberg (2009) or Krueger et al. (2011) for reviews), as well as attitudinal studies on onshore and offshore wind power (see Firestone et al. (2009) for an example). None addresses the issue we are concerned with, which is impacts on coastal recreation.
Firestone, J., W. Kempton, and A. Krueger (2009). "Public Acceptance of Offshore Wind Power Projects in the USA." Wind Energy 12: 183-202.
Krueger, A., G. Parsons, and J. Firestone (2011). "Valuing the Visual Disamenity of Offshore Wind Power Projects at Varying Distances from the Shore." Land Economics 87(2): 268-283.
Ladenberg, J. (2009). "Stated Public Preferences for On-land and Offshore Wind Power Generation: A Review." Wind Power 12(2): 171-181.
Landry, C.E., T. Allen, T. Cherry, and J.C. Whitehead (2012). "Wind Turbines and Coastal Recreation Demand." Resource and Energy Economics 34: 93-111.
Lilley, M., J. Firestone, and W. Kempton (2010). "The Effect of Wind Power Installations on Coastal Tourism." Energies 3(1): 1-22.
Schulman, S. and J. Rivera (2009). "Survey of Residents & Visitors in Four Communities Along the Southern New Jersey Shore." Report prepared for Fisherman Energy, LLC. William J. Hughes Center for Public Policy, Richard Stockton College of New Jersey.
5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
There are no impacts on small businesses or other small entities.
6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
Without the data collection and analysis proposed here, we would be forced to proceed with offshore leasing decisions and other marine spatial planning efforts with less information on impacts. This could delay decision making, lead to poorer leasing decisions based on incomplete information, and even put us out of compliance with our guiding legislation, which calls for taking into consideration the impacts of OCS decisions on recreational and cultural resources.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
* requiring respondents to report information to the agency more often than quarterly;
* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
* requiring respondents to submit more than an original and two copies of any document;
* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;
* in connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of study;
* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
* requiring respondents to submit proprietary trade secrets, or other confidential information, unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
There are no special circumstances that require us to collect the information in a manner inconsistent with OMB guidelines.
8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and in response to the PRA statement associated with the collection over the past three years, and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every three years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
We published a notice in the Federal Register on July 1, 2014 (79 FR 37348), soliciting public comments for 60 days. The comment period ended on September 2, 2014. We received comments from one person.
Comment: The location of residence (primary or secondary) should be given as a zip code. The zip code then determines the city, State, and distance to beach. There is no need for the respondent to guess what the distance is.
Response: The distance question has been deleted.
Comment: Offshore wind farms are a mature technology. A simple Google image search shows a variety of real photos of wind farms off Denmark and the UK. Consider the use of real pictures in place of simulated offshore wind turbines.
Response: We are particularly interested in the impact on beach use and tourism of wind projects at different distances offshore. It is not feasible to find pictures of existing projects at different distances while keeping other features constant (e.g., number of turbines, size of turbines, beach appearance, production quality for presentation on the Internet, etc.). The simulations allow us to “move wind projects” to different distances holding all other features constant. We also are interested in specific turbine sizes (larger than most of the existing ones) and turbine numbers (also larger than most existing projects). We also want to use beaches on the Atlantic coast for our shots. The coastlines in Europe where turbines exist are very different from the coastline in the United States.
Comment: The geology of the Atlantic OCS indicates it is a natural gas province. For example in the 1970s, there was a natural gas discovery off the coast of Atlantic City, New Jersey. Natural gas production accidents do not yield oil and tar balls. A better hypothetical would be beach closures from hurricanes and nor’easters. The respondents should be familiar with these kinds of events.
Response: These hypothetical beach closure questions have been dropped altogether.
Comment: There is a question asking for personal annual income from working. There are many who have considerable income without working. Is it the intent not to capture this information? They have the time and the resources to be frequent ocean beach users.
Response: The income question has been changed to read: “Which category is closest to your personal annual income before taxes?”
Comment: The stratum sample sizes for the survey give the appearance of being arbitrary. Consider that New Jersey and Delaware form a stratum with a population of 8.8 million and a sample size of 200 participants, which works out to 22.73 participants per million. Compare this to Pennsylvania, with a population of 10.4 million and 150 participants, or 14.42 participants per million. So citizens of Delaware are about 50% more likely to be selected than citizens of Pennsylvania. For full disclosure, the University of Delaware is conducting the survey, and I am a resident of Pennsylvania who is also a property owner in New Jersey. Further, someone in Memphis, TN, is part of the survey universe, while someone living in Vermont is excluded. I have family members who live in Vermont and frequently visit the Jersey Shore.
Response: Based on this comment and comments from others, we have redesigned the sampling strategy to include two separate samples: a General Population Sample and a Beachgoer-Only Sample (an oversample of beachgoers). The former is a random draw from all individuals in the 20 states in our region (now including Vermont, New Hampshire, Maine, and Georgia), and the latter is a random draw from all beachgoers in the same states. Since both samples are randomly drawn, representation is proportional to state populations.
Comment: A good property of a stratified design is homogeneity within each stratum (http://en.wikipedia.org/wiki/Stratified_sampling). The use of New York State as a stratum fails this principle. There is Long Island, which is a beach community; New York City, a major city with nearby ocean beaches; and upstate New York, for which ocean beaches are more distant. It does not make sense to put the Hamptons and Buffalo in the same stratum!
Response: See the response to the previous comment. We no longer stratify by state.
Comment: The total sample size of 1,400 participants is reasonable for obtaining summary insights. The data collection includes attributes such as distance to the beach, education, number of children, employment status, and income. If the survey has a goal of obtaining insights at this level of granularity, then the sample size will need to be adjusted to meet those goals.
Response: Our budget limits us to the sample size we are using.
Comment: The statistical survey design should follow Dillman's Tailored Design Method (http://www.amazon.com/Internet-Phone-Mail-Mixed-Mode-Surveys/dp/1118456149/ref=dp_ob_title_bk). This is the approach being used by BOEM for the Arctic Communities Survey in Alaska.
Response: Our survey follows Dillman’s method fairly closely. It may depart in a few instances based on our own judgment calls, but it is largely based on Dillman.
Comment: The commenter made the following recommendations:
* Establish clear goals for the information collection, which then drive the design.
* Use Dillman's Tailored Design Method.
* Create strata that are approximately homogeneous. Suggested strata: Near Ocean Beaches (SC coast, Outer Banks, Tidewater VA, Delmarva, Jersey shore, Long Island, Rhode Island, Cape Cod); Metro Areas (Washington, Baltimore, Philadelphia, New York City, Boston metro areas); Inland (other parts of SC, NC, VA, MD, central PA, NJ, CT, MA); Distant Areas (OH, WV, TN, KY, western PA, upstate NY, VT, NH).
* Use zip codes for respondent locations.
* Publish the raw data so it can be independently analyzed.
Response: We addressed most of these recommendations in our responses above. As noted, our survey was designed with a specific economic model in mind -- a travel cost model; we follow Dillman's approach fairly closely, but not always; we no longer stratify by geography; and we will use zip codes for respondent locations. In addition, we plan to publish the raw data.
In addition to the Federal Register notice, we conducted a pretest (see item 4, Supporting Statement B) to estimate the time it would take to complete the survey. Our burden estimates in item 12 reflect the results of the pretest.
9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
GfK has an incentive system for respondents, which is described in the GfK attachment.
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
The University of Delaware followed its human subjects protocol, which is standard across major universities. The survey will be administered under the Department of Health and Human Services Guidelines (45 CFR 46) when applicable.
The second page of the survey tells respondents that the information collected in the questionnaire is anonymous and that participation is voluntary. No personal names, birthdates, or social security numbers will be collected on the survey form. We have no way of identifying any person who participates in the survey, and respondents will be informed of this.
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
We will not ask questions of a sensitive nature.
12. Provide estimates of the hour burden of the collection of information. The statement should:
* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.
* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here.
Our analysis will use two samples: a General Population Sample and a Beachgoer-Only Sample. The same survey will be administered to both samples. GfK Custom Research (GfK, formerly Knowledge Networks) will conduct the survey using its Knowledge Panel, a nationally representative, probability-based online panel.
We will discuss the two samples separately. Also, see our response to question 1 in Supporting Statement B for more detail.
General Population Sample
We will solicit responses from 588 people in the General Population Sample. Of these, we anticipate that 88 will either not respond or will drop out of the survey. We estimate an average time for these people of approximately 2 minutes, based on a pretest of a similar survey. The remaining 500 people will complete the full survey and will include beachgoers and non-beachgoers. For these respondents we expect a completion time of 15 minutes. This gives a total burden of 128 hours.
Beachgoer-Only Sample
We will solicit another 5,378 people in the Beachgoer-Only Sample. Of these, 3,778 will either not respond or will drop out of the survey (most of the dropouts will be non-beachgoers). We estimate an average time for these people of 3 minutes, based on a pretest of a similar survey. This average is somewhat higher than for the General Population Sample because dropouts, who spend more time in the survey before exiting, outnumber non-respondents in this sample. The remaining 1,600 people will complete the full survey and will all be beachgoers. For these respondents we expect a completion time of 15 minutes. This gives a total burden of 589 hours.
The total dollar value of the annual burden hours is approximately $22,736 (see table below). We used the Bureau of Labor Statistics news release USDL-15-0386, March 11, 2015, Employer Costs for Employee Compensation—December 2014, to estimate average hourly wages and to calculate benefits. Table 1 of that release states an hourly rate of $22.65 for all workers. To account for benefits, we multiplied the hourly rate by 1.4, resulting in an hourly cost factor of $31.71.
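For reference, the hour and dollar totals reported here and in the table below follow directly from these counts, times, and the $31.71 hourly cost factor, with figures rounded to whole hours and dollars:

\[
\frac{88 \times 2 + 500 \times 15}{60} \approx 128 \text{ hours}, \qquad
\frac{3{,}778 \times 3 + 1{,}600 \times 15}{60} \approx 589 \text{ hours},
\]
\[
(128 + 589) \times \$31.71 \approx \$22{,}736.
\]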
Total Annual Hour Burden
Activity | Annual Number of Responses | Average Completion Time per Person | Total Annual Burden Hours | Total $ Value of Burden Hours ($31.71/hr)
General Population Sample | | | |
Non-respondents & Dropouts | 88 | 2 minutes | 3 | $95
Respondents | 500 | 15 minutes | 125 | $3,964
Total | 588 | 17 minutes | 128 | $4,059
Beachgoer-Only Sample | | | |
Non-respondents & Dropouts | 3,778 | 3 minutes | 189 | $5,993
Respondents | 1,600 | 15 minutes | 400 | $12,684
Total | 5,378 | 18 minutes | 589 | $18,677
Overall Total | 5,966 | | 717 | $22,736
13. Provide an estimate of the total annual non-hour cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected in item 12.)
* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information (including filing fees paid for form processing). Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.
* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.
* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.
We have not identified any non-hour cost burdens.
14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.
The University of Delaware will conduct the analysis. The total cost of the cooperative agreement to the Federal Government through BOEM is $200,000. About $31,000 of this is survey expense, including the photo simulations. Another $75,000 is for faculty and graduate student time plus $20,000 for benefits for faculty and students. Another $4,000 is for domestic travel. And finally, about $70,000 is overhead.
15. Explain the reasons for any program changes or adjustments in hour or cost burden.
This is a new collection of information.
16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
The results will be published in economics and policy journals (e.g., Land Economics, Marine Resource Economics, and Energy Policy) and presented at professional meetings (e.g., the American Economic Association meetings and the summer meetings of the Association of Environmental and Resource Economists). In addition, a report and presentations will be provided to BOEM staff, and a PhD dissertation based on the research will be completed at the University of Delaware.
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
We will display the OMB control number, expiration date, and PRA statement on the survey.
18. Explain each exception to the topics of the certification statement identified in "Certification for Paperwork Reduction Act Submissions."
There are no exceptions to the certification statement.