Human Response to Aviation Noise in Protected Natural Areas

OMB Control Number 2120-0744

  A. Justification

  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.

Justification and legal requirements for human response to aviation noise data collection in the National Parks

Congress passed the National Parks Air Tour Management Act of 2000 (NPATMA) effective April 5, 2000 (Public Law 106-181, 114 Stat. 61, Title VIII). NPATMA directed the Administrator of the Federal Aviation Administration (FAA), with the cooperation of the National Park Service (NPS), to develop Air Tour Management Plans (ATMPs) to regulate commercial air tour operations over units of the national park system. The FAA and NPS are jointly developing the Air Tour Management Plans required by this Act, with support from the U.S. Department of Transportation, John A. Volpe National Transportation Systems Center (Volpe Center). Approximately 100 park units will require the development of ATMPs. The ATMPs will prescribe acceptable and effective measures to mitigate or prevent significant adverse impacts, if any, of commercial air tour operations upon the natural and cultural resources of and visitor experiences in National Park units, as well as tribal lands included in or abutting a national park.

In addition to NPATMA, the National Park Service Organic Act of 1916 (16 U.S.C. 1, 2, 3, and 4) requires that resources of National Park units be preserved unimpaired for the enjoyment of future generations. Section 4.9 of the National Park Service Management Policies (2006) applies this requirement to the preservation of acoustic resources in National Park units. Section 8.4 dictates that the National Park Service will take all necessary steps to avoid or mitigate unacceptable impacts from aircraft overflights. Section 5.3.1.7 states that NPS will prevent noise from detracting from historic or cultural resource sounds. Section 8.11.1 states that the NPS will facilitate social science studies that support the NPS mission by providing an understanding of park visitors, the non-visiting public, gateway communities and regions, and human interactions with park resources; it further states that the NPS will facilitate social science studies to understand how park visitors experience park acoustic environments.

FAA Order 1050.1E, Environmental Impacts: Policies and Procedures, also notes that special consideration must be given to evaluating the significance of noise impacts on noise-sensitive areas within national parks or other designated properties.

Relevant documents are contained in the attachments to this statement. Appendix A provides copies of NPATMA, the National Park Service Organic Act of 1916, NPS Management Policies, Sections 4.9, 8.4, and 8.11.1, and FAA Order 1050.1E.

The proposed research effort will support the development of ATMPs: it will provide the FAA and NPS with critical information on human response to aviation noise in National Parks. The research will quantify the relationship between direct measures of aviation noise and the associated impacts on visitor experiences, as measured through visitor surveys. It will continue to build on, and improve upon, research conducted by the FAA and NPS from 1992 to 1998.i, ii, iii, iv, v The research will contribute to the development of soundscape-based indicators and standards of quality for use within the NPS Visitor Experience and Resource Protection (VERP) framework.vi, vii It will focus on areas subject to NPATMA, but may include other NPS units to which NPATMA does not currently apply.

The previous research on human response to aviation noise has been limited in two respects. First, previous studies have examined a limited subset of park site types (scenic overlook and short hike sites). Visitors to other site types, in particular backcountry areas, may be especially sensitive to aviation noise: these areas are remote, and ambient sound levels are often extremely low, making aviation noise all the more noticeable. Consequently, air tour management policies based on human-response data at scenic overlook and short hike sites may not be appropriate for backcountry sites. Second, while multiple survey techniques have been used, they have not been administered at the same parks and sites, preventing a robust comparison of their efficacy and utility.

The FAA and NPS need empirical guidance to support the development of effective and scientifically defensible ATMPs. However, quantifying the impacts of aviation noise on park visitors is complex. Previously collected data are insufficient, and there is no consensus on which survey methodology is most effective (nor on whether different park settings require different survey methods to measure noise impacts accurately). The proposed research effort addresses these issues. It was initiated by the FAA after a thorough planning and review process, which included workshops attended by agency representatives from the FAA (Western-Pacific Region and Office of Environment and Energy), NPS (Natural Sounds Program, Grand Canyon National Park, and Grand Teton National Park), and the Volpe Center, as well as numerous experts in acoustics, statistics, social science, and recreation management.

Re-analysis of previously collected noise dose-response data from the national parks has identified key noise-exposure descriptors and visitor questions that best capture the relationship between dose and response.viii In addition, analyses of past data have identified key mitigating variables that influence visitor response, including the presence of children in the respondent’s group, whether the respondent is visiting the park for the first time, and whether the respondent considers natural quiet to be very important for their visit. This information has improved our understanding of past data collections and is being used to guide the proposed work.

The proposed research expands on previous work in three ways:

  1. For previously studied site types (frontcountry scenic overlooks and frontcountry short hike sites), it provides additional data for low aircraft activity, to (1) establish the statistical significance of one additional, physically important aircraft noise metric and (2) thereby better justify future application to low-activity time periods. These additional data will also increase the number of specific sites for each site type, enabling better comparisons of site types among park units and more precise estimates of site-to-site variability.

  2. It simultaneously tests multiple survey instruments in the same settings to compare methodologies.

  3. It increases the number of site types represented in the survey collection by extending survey collection to activities/site types not previously studied (frontcountry day hikes, frontcountry historical/cultural sites, backcountry day and multi-day hikes, and backcountry camp sites)—thereby determining these site-type “offsets” from the two site types in the current database.

Information collection was originally anticipated to occur over the course of 2 to 3 summer seasons, with interviews collected for each of the 3 survey instruments (described in detail below) at multiple site types in each park (see the site-type lists above), in at least eight different parks (with a target of at least four parks per year). The specific sites and park units chosen for study will be based on factors such as the type of site and visitor activity to be represented, aircraft activity (air tour and non-air-tour), and visitation volume.

To date, we have collected information over the course of 2 summer seasons in 5 different parks. The information collected represents approximately 80% of original goals for backcountry day hikes, 18% of original goals for backcountry overnights, and 10% of original goals for the frontcountry site types. Dose-response relationships have been successfully developed for backcountry hikes,ix but sufficient data for backcountry overnight visitors and frontcountry cultural/historic visitors have not yet been collected, owing to logistical and funding constraints and the desire to evaluate the initial backcountry data before proceeding. It is anticipated that an additional 2-3 summer seasons in at least 3 additional parks will be required to provide sufficient data for these site types.

  2. Indicate how, by whom, and for what purpose the information is to be used.

The FAA and NPS will use the information from this collection to derive empirical guidance to support the development of effective, fair, and scientifically defensible Air Tour Management Plans.

Collection Instruments

The research effort will simultaneously administer three variations of park visitor survey instruments, all of which are fully described under question 2 of this Supporting Statement, in a variety of park settings. These instruments and their historical roots are as follows:

  1. The human response to aviation noise - visitor survey, version 1. This is an adaptation of the NPS / FAA / USAF Aircraft Overflight Studies visitor survey (OMB Nos. 1024-0088, 2120-0610, and 0701-0143).

  2. The human response to aviation noise - visitor survey, version 2. This is an adaptation of the NPS Soundscape: Attended Listening survey (OMB No. 1024-0224, NPS No. 07-014); and,

  3. The human response to aviation noise – visitor survey, audio recording evaluation version. This is an adaptation of the NPS Soundscape: Audio Recording Evaluation survey. (OMB No. 1024-0224, NPS No. 07-014).

These three survey instruments are hereafter called “Human Response Survey Instrument 1,” “Human Response Survey Instrument 2,” and “Audio Clip Evaluation Survey Instrument,” respectively. Each of the survey instruments and accompanying acoustic research components has strengths, limitations, and complementary characteristics, which suggests that a multiple-methods approach is likely to provide the strongest empirical basis for understanding human response to aviation noise in protected natural areas. Administering these survey instruments simultaneously at a series of site types and locations is intended to provide a consistent basis for comparing the research methods. This comparison will identify whether differences in question location and/or phrasing influence visitor response, and will determine whether audio clips elicit responses similar to those elicited by actual noise exposure during a site visit. In addition, this comparison will determine whether mediating variables similarly affect responses on the three survey instruments.

Survey question justifications

Questions Common to All Three Instruments

The three instruments have been adapted to share the first five and the last eight questions in order to provide identical evaluations of general visitor characteristics, experience, satisfaction, and demographic variables that may influence overall response. These questions will indicate whether the respondent populations of the surveys are similar, and will allow us to control for the influences of non-acoustic variables (i.e., not related to noise exposure) on visitor response. Detailed descriptions of the questions common to all three instruments are included below.

  1. Human Response Survey Instrument 1

The survey instrument is an adaptation of the instrument used in prior NPS/FAA park studies, specifically those at Haleakala, Hawai’i Volcanoes, Bryce Canyon, and Grand Canyon National Parks from 1992 to 1998. The noise exposure of the visitor(s) will be measured from a non-visible location. Observers will identify and time individuals or groups of visitors from entry to the site to the point of interception, allowing for statistical correlation of real-time acoustic noise exposure metrics with visitor response. Individuals or groups will be intercepted, invited to step away from the trail/overlook/other visitors (to reduce disturbance of others by the surveyor and to reduce pre-survey bias in potential future respondents), and asked to complete the attached paper surveys, with assistance available from the surveyor. Discussion among respondents will not be permitted during survey response.
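
To make the dose side of this pairing concrete, the following minimal sketch shows one common way a per-visitor noise exposure metric could be computed from sound levels logged between a group’s entry to the site and the point of interception. It is an illustration only: the sample values, the one-second sampling interval, and the choice of the equivalent continuous A-weighted level (LAeq) as the summary metric are assumptions made for this example, not a description of the study’s actual acoustic processing.

```python
import math

def laeq(spl_samples_dba):
    """Equivalent continuous A-weighted sound level (LAeq) for equally spaced
    SPL samples covering one visitor group's tracked interval (entry to intercept)."""
    if not spl_samples_dba:
        raise ValueError("no samples")
    # Energy-average the samples: convert dB to relative energy, average, convert back.
    mean_energy = sum(10 ** (level / 10.0) for level in spl_samples_dba) / len(spl_samples_dba)
    return 10.0 * math.log10(mean_energy)

# Illustrative only: one-second A-weighted levels (dBA) logged while a hypothetical
# visitor group moved from the site entrance to the intercept point.
visit_spl = [32.1, 33.0, 31.8, 45.6, 52.3, 48.9, 34.2, 32.5]
print(f"Visit LAeq = {laeq(visit_spl):.1f} dBA")
```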

Questions 1 - 3 (common to all three survey instruments) collect information about the respondent’s park visit.

Question 1 asks whether the visitor has been to the particular site before, and if so, how many times s/he has visited before. A respondent’s familiarity with the site has proven to be a key explanatory variable in previous studies.

Question 2 asks where the visitor has been within the park. This will provide researchers with a mechanism to check if a visitor has been outside the area where sound level measurements are being collected.

Question 3 provides a measure of the particular activity or activities in which the respondent has been engaged during the visit. Such a measure was not included in prior studies, and it is expected to be an explanatory variable in the relationship between aviation noise and visitor experience evaluations. The activities park visitors engage in may influence their perceptions of, and attitudes about, aircraft noise. For example, a visitor engaged in wildlife watching may be much more sensitive to and impacted by aircraft noise than a visitor browsing the park’s gift store.

Question 4 (common to all three survey instruments) is designed to measure the importance to the respondent of natural quiet (an explanatory variable) and other site factors. Some type of “importance” question is included in most recreational visit surveys. Importance of “natural quiet and the sounds of nature” was found to affect responses in prior research. This question assesses how the importance of natural quiet covaries with response to aircraft noise.

Question 5 (common to all three survey instruments) is designed to assess how closely the respondent’s actual experiences in the park and enjoyment of various aspects of the park, such as natural quiet, cultural and historical qualities, etc. matched their expectations for their visit. This provides a simple assessment of soundscape experience that can be used as a baseline for comparison of visitor populations and more specific noise dose-response variables.

Question 6 measures whether the respondent heard aircraft during their visit both for correlation with noise dose/exposure (e.g., sound pressure level, frequency of events, duration of events, etc.) and to filter out respondents who will not answer further questions on aircraft noise.

Questions 7 through 10 are the ‘core’ of the instrument, and assess the impact of aircraft activity during the respondent’s visit.

Question 7 measures general annoyance in response to aircraft noise. This is the most widely used measure of aircraft noise impact in most environmental noise studies. The wording and format are based on recommendations from the International Standards Organization (ISO).x It is needed for comparison to the NPS surveys performed in the 1990s.

Questions 8 and 9 measure the impact of aircraft noise on the respondent’s overall enjoyment of the site and the extent to which it interfered with particular aspects of the respondent’s visit, including natural and historical/cultural aspects. This relies on respondents’ judgments about impact, rather than a direct measure of impact. Question 8 is split into two parts to retain comparability with previous studies, as only the dimensions queried in Question 8 were included in prior aircraft overflight surveys.

Question 10 measures the respondent’s overall interpretation/evaluation of the acceptability of aircraft noise in the context of the National Park. This question will provide a key measure of comparability between the 3 survey instruments, as it is identical in wording to Human Response Survey Instrument 2 Question 6 and the Audio Clip Evaluation Survey Instrument Questions 6a, 7a, 8a, 9a, and 10a.

Question 11 measures whether the respondent saw aircraft during their visit and the level of annoyance in response to seeing aircraft.

Question 12 is an explanatory variable question that provides a basis for determining whether the perceived type of aircraft (fixed wing, jet, helicopter) affects reactions. It also helps to assess the congruence between the types of aircraft present and respondents’ perceptions of types. Since several types may be important, rough estimates of proportions are needed.

Question 13 assesses respondent reaction to the impact of aircraft noise on their experience within the context of other sounds the respondent may have heard at the site during their visit. This question lists aircraft-related noises along with numerous other kinds of sounds. This measures the extent to which positive reactions may have been missed on other questions. In addition it provides some local context for the importance of aircraft noise. The list of sounds is identical to that in Human Response Survey Instrument 2, Question 6.

Question 14 (common to all three survey instruments) asks about respondent participation in air tours. Participation in an air tour may influence a respondent’s perceptions of and attitudes about aviation noise in the park.

Questions 15 through 22 (common to all three survey instruments) collect basic demographic information and characteristics of respondents, such as group size and age distribution, that may affect respondent reactions to aircraft noise. In particular, the presence of children in the respondent group (Question 14) has shown explanatory power in prior studies.

  2. Human Response Survey Instrument 2

The survey instrument is an adaptation of an attended listening survey that was administered in parallel with acoustic monitoring and sound-logging in Sequoia and Kings Canyon National Parks (SEKI) during summer 2009. This survey will be administered using the same methodology for tracking, timing, and interception as described above for Human Response Survey Instrument 1. Instrument 2 will address how question order and other survey-design factors influence visitor response, compared to Human Response Survey Instrument 1. Specifically, Survey Instruments 1 and 2 reverse the order of the questions regarding assessment of aircraft noise: in Survey Instrument 1, the question with explicit reference to aircraft noise (Question 7) is presented before the sounds-rating assessment (Question 13), whereas Survey Instrument 2 presents the sounds-rating assessment (Question 6) first and assesses each sound using three evaluative dimensions.


Questions 1-5 are identical to those described in Human Response Survey Instrument 1, above.

Question 6: Respondent’s logging of sounds heard, and evaluations of those sounds. This section of the survey instrument asks respondents to identify the sounds they heard during their visit and to evaluate each sound heard on acceptability, personal interpretation, and experience impact scales. Aircraft sounds are listed in conjunction with a wide range of other types of sounds (e.g., natural sounds, human voices, machinery), with no emphasis on aircraft. This is done to avoid drawing undue attention to aircraft sounds, thereby minimizing the possibility for bias. The section serves multiple purposes concerning human response to aviation noise, as follows:

  • The section asks visitors to report their normative standards for hearing aircraft in national parks by asking respondents to rate the acceptability of the aircraft sounds they heard. The average acceptability ratings at each aviation noise dose are graphed to form a “social norm curve” for one or more acoustic properties of the aviation noise dose/exposure (e.g., sound pressure level, frequency of events, duration of events). The point at which the social norm curve crosses the neutral point of the acceptability scale is the threshold of acceptability (i.e., the point at which the sound pressure level/loudness becomes unacceptable); a minimal computational sketch of this calculation follows this list.

  • The section provides a measure of the impact (positive or negative) on visitor experiences associated with aviation noise doses and other human-caused and natural sounds.

  • The section provides a basis for assessing the threshold of acoustic properties of aviation noise doses beyond which visitors notice aircraft sounds while visiting the study area.
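
The first bullet above describes the social norm curve and its acceptability threshold in words. The minimal sketch below, using entirely hypothetical acceptability ratings on a -4 (very unacceptable) to +4 (very acceptable) scale, illustrates how such a curve and its neutral-point crossing could be computed; the dose bins, rating values, and linear interpolation are assumptions for this illustration rather than the study’s prescribed analysis.

```python
import numpy as np

# Hypothetical data: (noise dose in dBA, acceptability rating), where ratings run
# from -4 (very unacceptable) to +4 (very acceptable) and 0 is the neutral point.
ratings = [(30, 3.2), (30, 2.8), (35, 2.1), (35, 1.9), (40, 1.0), (40, 0.6),
           (45, -0.2), (45, 0.1), (50, -1.4), (50, -1.1), (55, -2.5), (55, -2.9)]

doses = sorted({dose for dose, _ in ratings})
# Social norm curve: mean acceptability rating at each dose level.
norm_curve = [float(np.mean([r for d, r in ratings if d == dose])) for dose in doses]

# Threshold of acceptability: the dose at which the norm curve crosses the neutral
# point (mean rating = 0), found by linear interpolation. The arrays are reversed
# because np.interp requires increasing x values.
threshold = np.interp(0.0, norm_curve[::-1], doses[::-1])

print("Social norm curve:", list(zip(doses, norm_curve)))
print(f"Acceptability threshold: about {threshold:.1f} dBA")
```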

Question 7 asks whether the respondent heard aircraft during their visit both for correlation with dose and to filter out respondents who will not answer further questions on aircraft noise.

Question 8 measures general annoyance in response to aircraft noise for those respondents who did hear aircraft. This question is identical to Question 7 in Human Response Survey Instrument 1.

Question 9 measures the impact of aircraft noise on the respondent’s overall enjoyment of the site and to what extent it interfered with particular aspects, including natural and historical/cultural aspects, of the respondent’s visit. This uses respondents’ judgments about impact, rather than a direct measure of impact. Question 9 has been used in all the aircraft overflight surveys to date. This question is identical to Question 9 in Human Response Survey Instrument 1.

Question 10 asks visitors to report their attitudes about aircraft management alternatives. In addition to providing another gauge of visitor attitudes toward aircraft noise, this question will provide park managers and other decision-makers with public opinion about the regulation of aircraft overflights in National Parks.

Questions 11-19 are identical to Questions 14-22 as described in Human Response Survey Instrument 1.

  3. Audio Clip Evaluation Survey Instrument

This survey instrument is an adaptation of survey instruments administered in Haleakala and Hawai’i Volcanoes National Parks (2007) and Muir Woods National Monument (2006) based on the “normative research approach.” Normative research is used in national parks and related areas to provide empirical support for the application of contemporary planning/management frameworks such as Visitor Experience and Resource Protection (VERP; developed and used by the NPS) and Limits of Acceptable Change (LAC; developed and used by the U.S. Forest Service). VERP and LAC are nearly identical, as both are essentially “management-by-objectives” frameworks, and both rely heavily on the formulation of indicators and standards of quality. Instrument 3 uses audio clips to deliver a predictable, consistent aircraft noise dose to visitors, in order to better measure the relationship between noise dose and response. However, this technique may concentrate the visitor’s attention on aural information to a greater extent than during a real visit to the park, and may therefore exaggerate the negative response to aircraft noise. Comparing the dose-response relationship obtained with this survey to the relationship obtained from on-site assessments of aircraft noise will yield a “dose-conversion factor” to reconcile the two sets of responses. Such a factor would then allow the use of audio clips in settings where real-time, field-based aviation noise measurements are not feasible.
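
As a purely hypothetical illustration of how such a dose-conversion factor might be derived, the sketch below fits simple two-parameter logistic dose-response curves to aggregated field and audio-clip responses and reports the decibel offset between them at the 50% response point. The data values, the logistic functional form, the 50% matching level, and the use of scipy’s curve_fit are assumptions made for this example, not the study’s specified method.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(dose, a, b):
    """Probability of an 'unacceptable' response as a function of noise dose (dBA)."""
    return 1.0 / (1.0 + np.exp(-(a + b * dose)))

def dose_at(p, a, b):
    """Dose at which the fitted curve reaches response probability p."""
    return (np.log(p / (1.0 - p)) - a) / b

# Hypothetical aggregated data: columns are dose (dBA) and the fraction of
# respondents rating the exposure unacceptable.
field = np.array([[35, 0.05], [40, 0.15], [45, 0.35], [50, 0.60], [55, 0.85]])
clips = np.array([[35, 0.15], [40, 0.35], [45, 0.60], [50, 0.80], [55, 0.95]])

(a_f, b_f), _ = curve_fit(logistic, field[:, 0], field[:, 1], p0=[-10, 0.2])
(a_c, b_c), _ = curve_fit(logistic, clips[:, 0], clips[:, 1], p0=[-10, 0.2])

# Dose-conversion factor: how many dB the audio-clip curve sits below the field
# curve at the 50% response point.
offset = dose_at(0.5, a_f, b_f) - dose_at(0.5, a_c, b_c)
print(f"Dose-conversion offset: about {offset:.1f} dB")
```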

Questions 1-5 are identical to those in Human Response Survey Instrument 1.

Questions 6-10 are the core questions in this survey instrument. Respondents are asked to listen to and evaluate a series of five 30-second, pre-recorded audio clips that combine a background layer of natural sounds with varying levels (sound pressure levels/loudness) of aircraft noise. Respondents are asked to rate the acceptability of, and the personal feelings elicited by, the sounds in each clip; each of these dimensions is further described in the paragraphs below. This approach gives researchers a high degree of control over the dose/stimuli, and therefore may reduce prediction uncertainty and the potential for measurement error.

In Part a, respondents are asked to rate the acceptability of each sound clip. The average acceptability ratings of each sound clip are commonly used to determine the social norm for this indicator variable.xi This question is designed to be directly comparable to Human Response Survey Instrument 1, Question 10, and Human Response Survey Instrument 2, Question 6.

In Part b, respondents are asked to evaluate their personal feelings elicited by each clip on a scale ranging from extremely pleasing to extremely annoying. It is designed to give comparable results to Human Response Survey Instrument 1 Question 7 and Human Response Survey Instrument 2 Question 6.

Questions 11-13 ask respondents to listen to an additional 30-second audio clip and rate it, focusing on the potential indicator variable of frequency of hearing aircraft. Question 11 uses the evaluation concept of acceptability (as did Question 6), while Questions 12-13 use the evaluation concepts of “preference” (what conditions visitors would prefer) and “displacement” (what conditions would be so bad that visitors would no longer come to the park). Together these questions provide a range of thresholds, from a very high quality visitor experience (“preference”) to a very low quality experience (“displacement”).

Question 14 asks whether the respondent heard aircraft during their visit to the site, both for correlation with dose and to filter out respondents who will not answer further questions on aircraft noise. This question is identical to Questions 6 and 7 in Human Response Survey Instruments 1 and 2, respectively.

Question 15 measures general annoyance in response to aircraft noise for those respondents who heard aircraft during their visits to the site. This question is identical to Questions 7 and 8 in Human Response Survey Instruments 1 and 2, respectively.

Question 16 asks visitors to report their attitudes about aircraft management alternatives and is identical to Question 10 in Human Response Survey Instrument 2.

Question 17 is identical to Question 15 as described in Human Response Survey Instrument 1.

Question 18 asks the respondent to evaluate the potential tradeoffs between the personal benefit of taking an air tour and the potential impacts on other park visitors.

Questions 19-26 are identical to Questions 15-22 as described in Human Response Survey Instrument 1.

  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

This information will be collected via on-site surveys and interviews, often at remote, backcountry sites. No automated survey data collection will take place. The normative (audio clips) survey instrument will use recorded sound clips, combining park ambient sound with varying levels of aircraft noise, to provide a range of controlled audio doses. Administration of Human Response Survey Instruments 1 and 2 will require audio recording equipment to measure ambient noise levels and aviation events. This equipment will be placed in a discreet location near the sampling location, and respondents will be unaware of the equipment and the recording.

  4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

Data on aircraft noise impacts derived from previous studies are useful but insufficient, for three primary reasons. First, prior studies surveyed park visitors at frontcountry overlook and short-hike sites, and did not include backcountry and other remote locations. Consequently, park environments where noise impacts may be the most acute are not adequately represented. Second, numerous other surveys that included evaluations of aircraft noise did not collect simultaneous acoustic data. These acoustic data are paramount in the development of scientifically defensible noise impact criteria. Third, as discussed previously, there is no consensus on which survey methodologies are best suited for particular locations, or on survey comparability with regard to accuracy and reliability. The proposed research effort intends to determine this by administering all three types of survey instruments simultaneously (along with the simultaneous collection of acoustic data) at a series of site types and locations. For example, on a given day visitors to a scenic-overlook site could be surveyed, with each of the three survey instruments administered to one-third of the respondents (randomly assigned based on surveyor availability). The proposed work will reduce duplicative and/or ineffective efforts in future survey collections by establishing efficacy and guidelines for data collection and comparability among survey instruments.
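
A minimal sketch of the random one-third assignment mentioned above follows. It balances the three instruments across blocks of three intercepted visitor groups and then shuffles the administration order; the function name, the block-balancing choice, and the fixed seed are illustrative assumptions rather than a prescribed field protocol.

```python
import random

INSTRUMENTS = ["Human Response Survey Instrument 1",
               "Human Response Survey Instrument 2",
               "Audio Clip Evaluation Survey Instrument"]

def assign_instruments(n_groups, seed=None):
    """Assign one of the three instruments to each intercepted visitor group so that
    roughly one-third of respondents receive each instrument over the day."""
    rng = random.Random(seed)
    # Build a balanced pool (one of each instrument per block of three groups),
    # then shuffle so the order of administration is unpredictable.
    pool = (INSTRUMENTS * (n_groups // 3 + 1))[:n_groups]
    rng.shuffle(pool)
    return pool

# Example: plan assignments for 12 visitor groups intercepted at an overlook site.
for i, instrument in enumerate(assign_instruments(12, seed=42), start=1):
    print(f"Group {i:2d}: {instrument}")
```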

  5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.

The data collection will not impact small businesses or other small entities.

  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

Should these data not be collected, ATMPs for as many as 100 parks would have to be developed without sufficient empirical evidence about actual visitor experiences in the parks, particularly for visitors to backcountry and other site types that have not been studied. The resulting regulations of air tour operators might not address the conditions and visitor use context at certain parks effectively, and might be too lenient in some cases or unnecessarily stringent in others. There are no technical or legal obstacles to reducing burden, as the burden to any individual respondent is already very low (ten to fifteen minutes).

  7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

  • requiring respondents to report information to the agency more often than quarterly;

  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • requiring respondents to submit more than an original and two copies of any document;

  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.

These circumstances are not applicable to our collection of data. Our research consists of one-time, on-site park visitor surveys. Therefore, frequency of reporting, preparation or submission of documents, retention of records, and disclosure of trade secrets do not apply. This research includes a survey designed to produce statistically valid and reliable results. The survey instrument will use only data classifications reviewed and approved by OMB. The survey will not collect personally identifiable information. The researchers administering the survey will read an introductory statement to each respondent explaining the information to be collected, and will offer a pledge of anonymity.

  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice [and in response to the PRA statement associated with the collection over the past three years] and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.

As stated in Question 1, the FAA and Volpe Center, with the assistance of the NPS, sponsored two workshops with the goal of developing a mutually acceptable and coordinated strategy for continued noise exposure-human response research in the National Parks. These workshops brought together experts in the fields of acoustics, social science and recreation management. At the conclusion of these workshops, the experts were contracted by the Volpe Center to refine and further develop the survey instruments and methods to be utilized in this research. The survey instruments included in this package have been fully vetted among this group.

Consultation with representatives of the survey subjects is not necessary because each respondent is surveyed only once. The survey researchers will consult with national park superintendents, managers, and rangers, and obtain the necessary permits. In addition, members of the National Parks Overflight Advisory Group (NPOAG) have been informed of this research. The NPOAG is composed of a balanced group of representatives of general aviation, commercial air tour operators, environmental concerns, and Indian tribes.

A 60-day notice for public comments was published in the Federal Register on June 22, 2015, vol. 80, no. 119, page 35693. No comments were received.

  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

No payments or gifts will be provided.

  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

No assurance of confidentiality will be provided to respondents. The survey will not collect any personally identifiable information. Those who inquire about this issue will be told that reports prepared from this study will summarize findings so that responses will not be associated with any specific, identifiable individuals. Anonymity will be ensured, but confidentiality will not be pledged.

  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

The survey will not include any questions of a sensitive nature. In addition, respondents will be advised that their answers are voluntary.

  12. Provide estimates of the hour burden of the collection of information.

On the basis of the previous adaptations of these surveys conducted in several National Parks, we are confident that the proposed surveys will require no more than ten to fifteen minutes for each respondent to complete, regardless of the particular survey instrument being used. There is no preparation time (i.e., record keeping) required of the respondents, and for each respondent this will be a one-time event. We estimate the maximum total burden hours for this research to be 4,200. This is based on a maximum number of completed surveys of 16,800, at 15 minutes per survey.
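
The arithmetic behind this estimate is straightforward; the short sketch below simply reproduces it from the figures stated above (16,800 completed surveys at 15 minutes each).

```python
# Worked check of the burden estimate, using the figures stated in this statement.
max_completed_surveys = 16_800
minutes_per_survey = 15

total_burden_hours = max_completed_surveys * minutes_per_survey / 60
print(f"Maximum total burden: {total_burden_hours:,.0f} hours")  # 4,200 hours
```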

  13. Provide an estimate of the total annual [non-hour] cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

The cost burden on respondents and record-keepers, other than hour burden, is zero.

  14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.

Estimates of staff labor, supplies and other expenses were based on previous dose-response research efforts, and the survey plan outlined in Supporting Statement B. The research effort will require the following staffing levels at each of the parks:

  • One Project Manager (from the Volpe Center)

  • One Survey Research Supervisor

  • Two Survey Researchers per study location, for a total of ten to eighteen researchers per park.

  • Four to six acousticians per park.

The following table summarizes the estimated study costs:

Table 2: Estimated Cost to Federal Government

  15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.

No adjustments to the burden time are necessary for the upcoming survey cycle. Updates to survey response rates for the previous cycle are provided in Supporting Statement Part B.

  16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

The analysis will begin with standard social statistics for these types of data. Overall frequency distributions will be computed for each variable, and perception measures will be analyzed for differences between categories of respondents based on the types of primary recreation activities in the park, as well as visitor demographics. Analyses to verify the reliability and internal validity of the survey instruments will be conducted. In addition, multivariate logistic regression will be employed to quantify relationships between the sound stimuli and human-response dimensions (annoyance, acceptability, interference with natural quiet). Social norm curves, as described above, will also be employed to characterize average annoyance thresholds. Additional information about the analytical techniques is provided in Supporting Statement Part B.
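
As an illustration of the regression step described above, the sketch below fits a multivariate logistic regression of a binary annoyance response on a noise-dose metric and two mediating variables, using synthetic data. The variable names (laeq_dba, first_visit, quiet_important), the simulated relationship, and the use of the statsmodels package are assumptions made for this example and do not represent the study’s actual model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Synthetic, illustrative data: per-respondent noise dose and two mediating variables.
df = pd.DataFrame({
    "laeq_dba": rng.uniform(30, 60, n),        # measured noise dose (dBA)
    "first_visit": rng.integers(0, 2, n),      # 1 = first visit to the site
    "quiet_important": rng.integers(0, 2, n),  # 1 = natural quiet rated "very important"
})

# Simulate a binary "annoyed" response that rises with dose and varies with the mediators.
linear_predictor = -12 + 0.25 * df.laeq_dba + 0.4 * df.quiet_important - 0.3 * df.first_visit
df["annoyed"] = (rng.random(n) < 1 / (1 + np.exp(-linear_predictor))).astype(int)

# Multivariate logistic regression of response on dose and mediating variables.
model = smf.logit("annoyed ~ laeq_dba + first_visit + quiet_important", data=df).fit(disp=False)
print(model.summary())
```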

  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

We are not seeking such approval.

  18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.

There are no exceptions to the certification statement.

References

i Anderson, G. S., Horonjeff, R. D., Menge, C., Miller, N. P., Robert, W. E., Rossano, C., et al. (1993). Dose-Response Relationships Derived from Data Collected at Grand Canyon, Haleakala and Hawaii Volcanoes National Parks. NPOA Report No. 93-6. Lexington, MA: Harris Miller Miller & Hanson.

ii Fleming, G. G., Roof, C. J., Rapoza, A. S., Read, D. R., Webster, J. C., Liebman, P. D., et al. (1998). Development of Noise Dose / Visitor Response Relationships for the National Parks Overflight Rule: Bryce Canyon National Park Study. Federal Aviation Administration. Report No. FAA-AEE-98-01. Washington, D. C.: U.S. Department of Transportation. Available from: http://www.volpe.dot.gov/acoustics/pubs1.html

iii Rapoza, A. S. (2005). Study of Visitor Response to Air Tour and Other Aircraft Noise in National Parks. Report No. DTS-34-FA65-LR1. Cambridge, MA: U.S. Department of Transportation, Volpe National Transportation Systems Center.

iv Miller, N. P., Anderson, G. S., Horonjeff, R. D., Thompson, R. H., Baumgartner, R. M., & Rathbun, P. (1999). Mitigating the Effects of Military Aircraft Overflights on Recreational Users of Parks: Final Report. Burlington, MA: Harris Miller Miller & Hanson.

v Miller, N. P. (1999). The effects of aircraft overflights on visitors to U.S. National Parks. Noise Control Engineering Journal, 47(3), 112-117.

vi Lawson, S. R., Hockett, K., Kiser, B., Reigner, N., Ingram, A., Howard, J., et al. (2007). Social Science Research to Inform Soundscape Management in Hawaii Volcanoes National Park: Final Report. Virginia Polytechnic Institute and State University, Department of Forestry, College of Natural Resources, Virginia. Available from: http://www.faa.gov/about/office_org/headquarters_offices/arc/programs/air_tour_management_plan/park_specific_plans/Hawaii_Volcanoes.cfm

vii U.S. Department of the Interior, National Park Service. (1997). The Visitor Experience and Resource Protection (VERP) Framework: A Handbook for Planners and Managers. Denver: Denver Service Center.

viii Anderson, G. S., Rapoza, A. S., Fleming, G. G., & Miller, N. P. (2011). Aircraft noise dose-response relationships for National Parks. Noise Control Engineering Journal, 59(5), 519-540.

ix Rapoza, A.S., Sudderth, E.A., and Lewis, K.L. (under review). The relationship between aircraft noise exposure and day-use visitor survey responses in backcountry areas of National Parks. Journal of the Acoustical Society of America.

x International Standards Organization. (2003). Acoustics -- Assessment of Noise Annoyance by Means of Social and Socio-Acoustic Surveys (ISO/TS 15666:2003).

xi Pilcher, E. J., Newman, P., & Manning, R. E. (2008). Understanding and Managing Experiential Aspects of Soundscapes at Muir Woods National Monument. Environmental Management.


