
Supporting Statement A

Programmatic Review for NPS-sponsored Public Surveys

OMB Control Number 1024-0224


Terms of Clearance: This programmatic collection of information is approved to facilitate the NPS Social Science Program. As such, the agency is reminded that this approval is restricted to the seven topical areas specified in the Supporting Statement. This approval is not suitable for programmatic evaluation, significant policy evaluation, or non-market good valuations using both contingent valuation and hypothetical behavior questions. In addition, this information collection is approved for 3 years to facilitate NPS’s development, for its Visitor Survey Program, of a “continuous improvement plan” that includes small-scale tests embedded in future collections to address items recommended during consultations with a survey design expert, which include:


check-all response format and double-response requests,

item nonresponse analysis to help determine specific problematic questions, and

cognitive testing on question ordering


NPS should report prior to its next submission the results of operational and other testing that will accompany the planned move to a scanned questionnaire, as well as results from the specific testing items listed above. When NPS submits a survey under this collection, the agency must submit the survey instrument and identify the questions in the instrument that are from the “Pool of Known Questions.” In addition, the agency must submit a sufficient literature review, a description of the methods, and a description of any methodological research of which the survey will be a part (e.g., whether the methods are well established or not). For any questions not substantially similar to those in the “Pool” or drawn from the literature, NPS should propose sufficient qualitative testing to validate the questions prior to their use. The agency is reminded that it may not proceed with the survey unless it receives affirmative notice from OMB. OMB will strive to meet the expedited review schedule requested by the agency in this submission.


The Terms of Clearance have been met; please see item 3 and the attached memo.




General Instructions


A completed Supporting Statement A must accompany each request for approval of a collection of information. The Supporting Statement must be prepared in the format described below, and must contain the information specified below. If an item is not applicable, provide a brief explanation. When the question “Does this ICR contain surveys, censuses, or employ statistical methods?” is checked "Yes," then a Supporting Statement B must be completed. OMB reserves the right to require the submission of additional information with respect to any request for approval.


Specific Instructions



Justification

1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.


The National Park Service requests extension/renewal of its Programmatic Clearance for NPS-Sponsored Public Surveys so that we may better fulfill our program-specific mission as well as our agency-wide responsibilities to provide excellence by proactively consulting with those we serve. Customer data are needed to:


  1. Meet requirements of the Government Performance and Results Act (GPRA)

  2. Realize the President’s Management Agenda (PMA)

  3. Meet requirements of Interior’s Citizen-Centered Customer Service Policy

  4. Meet requirements of Executive Order 12862

  5. Fulfill our responsibilities to provide excellence in government by proactively consulting with those we serve to identify opportunities to improve our information, services, and products to better meet their needs.


The National Park Service (NPS) preserves the nation’s natural and cultural heritage and provides for its enjoyment by citizens and visitors from throughout the world. An accurate understanding of the relationship between people and parks is critical to achieving the mission of the National Park System to protect resources unimpaired and to provide for public enjoyment, education, and inspiration. Such understanding requires a sound scientific basis. Hence, social science research is a necessary and important function of the agency.


The NPS is required by the National Park Service Act of 1916 (39 Stat. 535; 16 U.S.C. 1 et seq.) to preserve national parks for the use and enjoyment of present and future generations. At the park level, this means resource preservation, public education, facility maintenance and operation, and such physical development as is roughly in proportion to the seasonally adjusted volume of use (P.L. 88-578, Sect. 6) and in consideration of visitor characteristics and activities for determining park carrying capacity (92 Stat. 3467; P.L. 95-625, Sect. 604, 11/10/78). Other federal rules (National Environmental Policy Act, 1969, and NPS Management Policies, 2006) require input from the public when assessing the impact of development on users, potential users, and residents near parks as part of each park’s General Management Plan. These laws, policies, and regulations dictate periodic surveys of national park visitors, potential visitors, and residents of communities near parks.


NPS is requesting that OMB grant an extension of the existing programmatic approval (OMB# 1024-0224) and assign a new expiration date and burden hour budget to NPS. The scope of the programmatic review process remains unchanged and will continue to include individual surveys of park visitors, potential park visitors, and residents of communities near parks. Use of the programmatic review will be limited to non-controversial surveys of park visitors, potential park visitors, and/or residents of communities near parks that are not likely to raise topics of significant interest or controversy during the review process.

As part of its Social Science Program, the NPS uses the programmatic clearance process to manage the PRA process for information collections needed to provide park managers with data for improving the quality and utility of NPS programs. Many of the NPS surveys are similar in terms of the populations being surveyed, the types of questions being asked, and the research methods used. In 1998, the NPS and the Department of the Interior (DOI) proposed a pilot program of expedited approval for these NPS surveys. The program presented an alternative approach to complying with the Paperwork Reduction Act of 1995 (PRA). Clearance for the pilot program was granted by the Office of Management and Budget (OMB# 1024-0224). In 2001, the NPS Social Science Program requested an extension of the program. Clearance was granted on September 19, 2001 and further extensions were granted on January 31, 2005 and June 5, 2008.


The benefits of this program—referred to as the NPS programmatic approval—have been significant to NPS, DOI, OMB, NPS cooperators, and the public. In the 11 years of the programmatic approval, 483 individual surveys have been approved in support of NPS management and planning, providing the federal government and researchers a time and cost savings estimated to be at least $937,272 (see FY 2009 Annual Report).


Under the current NPS programmatic clearance process, the surveys are submitted to the NPS Information Collection Review Coordinator. The submissions are processed and reviewed for clarity and general adherence to the guidelines approved by OMB. Once the individual survey instruments are certified as a program of study for the purpose of overall clearance, they are submitted by NPS to OMB for final review and approval. Once the collection has received approval from OMB, the current OMB control number and expiration date (that does not exceed the expiration date for the programmatic approval) will be required to appear on each approved survey instrument and any accompanying documents. NPS will notify the investigator that the survey is approved. Additional monitoring will occur through annual reports submitted by NPS summarizing activity under the programmatic clearance for the previous fiscal year.


NPS continues to maintain an information base of public surveys conducted in parks to be used to increase the efficiency of future surveys. All approved survey instruments and final survey reports are archived with the NPS Social Science Program as part of the Social Science Studies Collection. This collection is available to researchers. Documents in the collection are currently available in digital format through the NPS website (http://npsfocus.nps.gov/). Hard copies are located at the Social Science Program office in Fort Collins, CO.


NPS will provide technical and administrative review of proposed surveys and communicate review comments to investigators. In appropriate cases, NPS may recommend that submitted proposals undergo review under the full PRA process, rather than the programmatic clearance process.


The programmatic clearance process will also apply to any pre-tests of surveys if they include similar questions asked of 10 or more persons. Requests for approval of pre-tests may be submitted with the survey approval request or submitted separately, as deemed appropriate by NPS and OMB.


If, after consultation with investigators, a proposed survey is recommended for approval by the Social Science Program, NPS will transmit to OMB the survey instrument and certification. OMB agrees to an expedited review within ten working days of receiving the submission package.





2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. Be specific. If this collection is a form or a questionnaire, every question needs to be justified.


These data are being collected to improve the service and products that NPS provides to the public. Park managers and planners have used customer satisfaction and visitor use data to support all aspects of planning, from buildings, roads, and interpretive exhibits, to technical systems. In conducting their management, planning, and monitoring activities, managers also use the information to allocate their limited personnel and financial resources as effectively as possible to the highest priority elements.


This Programmatic Clearance process is limited to non-controversial information collections that do not attract attention to significant, sensitive, or political issues. Examples of significant, sensitive, or political issues include: seeking opinions regarding political figures; obtaining citizen feedback related to high-visibility or high-impact issues like the reintroduction of wolves in Yellowstone National Park, the delisting of specific Endangered Species, or drilling in the Arctic National Wildlife Refuge.


All information collection instruments will be designed and deployed based upon acceptable statistical practices and sampling methodologies, and will be used to obtain consistent, valid data that are representative of the target populations and account for non-response bias, according to the most recent OMB guidance on “Agency Survey and Statistical Information Collections (January 20, 2006).”


Uses of the Information


Continuing the programmatic clearance process would benefit NPS management and the NPS Social Science Program in several ways, by providing information on:

  • Service needs of customers

  • Strengths and weaknesses of services

  • Ideas or suggestions for improvement of services from our customers

  • Barriers to achieving customer service standards

  • Changes to customer service standards

  • Baselines to measure change in improving service delivery over time

  • Ways to improve public trust in government


Members of the scientific community who partner with the NPS in administering surveys would benefit through:

1) a more efficient, effective, and timely management review process;

2) greater focus on peer review to improve the scientific quality of information collections;

3) increased attention to methodological improvements and use of best practices; and

4) better administration and wider sharing of information obtained from surveys of the public.


Allowable Information Collection Methods


On-Site and In-person intercept surveys:

Survey instruments are provided to respondents on site to complete and return. This may include oral administration, paper forms, or the use of electronic technology and kiosks. The survey proctor is prepared to answer any questions the respondent may have about how to fill out the instrument but does not interfere with or influence how respondents answer the questions.


Mail and e-mail surveys:

Using existing lists of customer addresses, a three-contact approach based on Dillman's “Tailored Design Method” will be employed. The first contact will be a cover letter explaining that a survey is coming and why it is important. The second contact will be the survey instrument along with a postage-paid, addressed envelope for returning the survey. The third contact will be a reminder postcard sent 10 days after the survey was sent. Finally, respondents will receive a letter thanking them for their willingness to participate in the survey and reminding them to return it if they have not already done so. At each juncture, respondents will be given multiple ways to contact someone with questions regarding the survey (including phone, web, or e-mail). If the survey has been lost, the respondent can request that another be sent. Electronic mail is sometimes used instead of postal mail to communicate with customers. Although this is a cost-effective mode for surveying a large group of people, it does not usually generate the best response rate. Telephone calls to non-respondents can be used to increase response rates.


Telephone interviews or questionnaires:

Using existing databases, an interviewer may use a random selection process to contact visitors who may have had specific experiences within a park. The interviewer will use an approved dial-back method until the visitor has been reached. Once contacted, the survey respondent is given a brief introduction to the survey, including its importance and use. The interviewer will then expeditiously move through the survey questions. If the interviewer fails to make contact, the approved process will be followed to reach the next contact.


Focus groups:

Some data and information are best collected through more subjective, conversational means. Focus groups are small, informal group discussions designed to obtain in-depth qualitative information. Individuals are specifically invited to participate in the discussion, whether in person or through technologically enhanced means (e.g., video conferencing, on-line sessions). Participants are encouraged to talk with each other about their experiences, preferences, needs, observations, or perceptions. A moderator whose role is to foster interaction leads the conversation. The moderator will make sure that all participants are encouraged to contribute and that no individual dominates the conversation. Furthermore, the moderator will manage the discussion to make sure it does not stray too far from the topic of interest. Focus groups are most useful in an exploratory stage or when the bureau/office wants to develop a deeper understanding of a program or service. Using best focus group research practices, groups will be constructed to include a cross-section of a given customer group.


Comment Cards:

Comment cards offer an excellent means for visitors to give park managers feedback. A comment card should have a limited number of questions and an opportunity to comment. These comment cards provide managers and service providers with direct and specific information from their customers that could not be obtained through any other means. The intent of comment cards is to provide anecdotal feedback, not to serve as a statistically reliable measure. Although questions may include numeric scales, the data are not intended to be statistically representative.

Types of Questions asked


There are seven topic areas that NPS programs can use to obtain voluntary information from their visitors. It is not expected that any one survey will cover all the topic areas; rather, these topic areas serve as a “guideline menu” from which the parks will develop their questionnaires. Under the information collection process, the NPS Social Science Program has developed questions that fit within the generally understood confines of each of the seven topic areas.


  1. Respondent Characteristics and Knowledge

    • socioeconomic and demographic characteristics (e.g., age, education, race and ethnicity, primary language, gender, residence, and income)

    • knowledge

    • visitation history

    • transportation uses

    • other individual and group characteristics

  2. Trip Planning

    • trip purpose

    • visit motives

  3. Trip Behaviors

    • activities

    • future visits

    • itinerary

    • learning

    • questions to predict and explain activities and uses

  4. Preferences, Motives and Attitudes

    • place attachment

    • recreation preferences

    • soundscape perceptions

    • Theory of Planned Behavior

  5. Crowding and Visitor Experiences

    • crowding perceptions

    • Visitor Experience and Resource Protection (VERP)

    • place attachment

    • importance-performance analysis

  6. Evaluations/Opinions of Services, Facilities, and Management

    • evaluation of fees

    • evaluation of services

    • evaluation of transportation

    • opinions on Park Management

  7. Economic Impact and Benefit Analysis.

    • actual expenditures in parks and gateway regions

    • willingness to pay for park services


In the following sections, the need for each topic is discussed. Methods of inquiry commonly employed in questions pertinent to the topics are also presented. The NPS Social Science Program will encourage the use of recognized frameworks and question formats, while recognizing that there are different ways of answering common questions through social science research. Therefore, the Social Science Program will not prescribe methods that must be used, as long as alternative formats are demonstrated to be valid and reliable. This allows research to be responsive to the specific needs of parks. The information elicited by these questions supports facility planning, interpretation and education, public risk management, and outreach to under-served populations. This information is also used by NPS partners to improve service provision to visitors. Partners of the NPS include concessioners, cooperating associations, and community-based organizations, such as local governments, convention and visitor bureaus, and chambers of commerce.


Topic Area 1 - Individual Characteristics and Knowledge


Individual characteristics are attributes of park visitors, potential visitors, and residents of communities near parks. Examples of questions included in this topic area are those asking respondents’ age, zip code (or country of residence), ethnicity, race, physical limitations, language preference and use, educational attainment, and household income.


Under the programmatic clearance process, socioeconomic and demographic questions will be limited to those that are germane and useful to fulfill the mission of the park or the NPS. Qualitative studies that do not generalize to specific populations should minimize the number of socioeconomic and demographic questions asked, unless they are intrinsic to the research topic or when it is necessary to document the diversity of a qualitative sample. In these cases, relevant demographic questions may be asked.


OMB provides guidance concerning demographic questions, such as race and ethnicity. For other items, including age, gender, zip code, physical limitations, education, language preference, and household income, the Social Science Program recommends that investigators follow well-established formats; any questions not substantially similar to those in the “Pool” should be drawn from established methods identified in the social science literature. In these cases, the questions outside the “Pool” will be identified and annotated to describe the purpose and intent of the variation. The Census short form or the American Community Survey demographic questions can be used as an option for comparing the individual characteristics of populations sampled in park surveys with regional or national populations.


In addition to socioeconomic and demographic variables, many measures of individual characteristics will be used to further describe visitors’ knowledge levels and previous experience. Examples are:


    • group size, group type (e.g., tour, alone, family, friends), and group composition (e.g., age, race, ethnicity);

    • frequency of visits (new vs. repeat or regular visitors);

    • prior knowledge of a park as a unit of the National Park System;

    • experience use history (e.g., past experience and skill levels in recreational activities available in a park);

    • knowledge of park programs and management issues.


The Social Science Program works with the NPS Visitor Services Project (VSP) to refine the questions that are regularly included in other surveys conducted in national parks, forests, and similar areas. The Social Science Program encourages investigators to use the questions in the Pool of Known Questions that incorporate established VSP wording and measures when doing so is consistent with the purpose of their studies. Other questions, such as those measuring experience use history, have been employed and validated in numerous recreation surveys (Hammitt et al., 2004).


Topic Area 2 - Trip Planning


Although not an exhaustive list, questions in this topic area cover the aspects of travel that affect the trip decisions individuals make prior to or during their trip to the parks. These can include: 1) trip purpose, 2) visit motives, and 3) information sources.


Questions about trip purpose help the NPS understand how parks are used by visitors. In some cases, a park visit is the primary and planned reason for a trip. In other cases, visits are incidental to business travel, visiting family or friends, attending a festival, event, or other attraction, or “passing through” on the way to another destination. Knowing if a park is the primary purpose of a trip is especially important in refining the estimation of the economic impacts of visitor spending in gateway regions.


The purpose of a trip may affect length of stay, information and facility needs, and activities participated in during a visit. Questions about trip purposes (i.e., is the park a primary destination, one of several destinations, etc.) are routinely included in NPS surveys. These items are also similar to questions used in general leisure travel studies, such as the TravelScope/DIRECTIONS survey of the Travel Industry Association.


Visit motives are the internal states or conditions that activate travel behavior and give it direction. Motives are sometimes referred to as “wants,” “desired experiences,” or “anticipated benefits” that energize action. The questions used to identify visit motives are fundamental to understanding why people do what they do in parks. Further, the discrepancy between trip motive and what is actually experienced can be a significant determinant of visitor enjoyment.


Parks have used data describing visitor motives to understand the types of experiences people expected or preferred in parks. For example, knowing whether or not visitors are seeking educational experiences can explain the use or non-use of interpretive services. Knowledge of visit motives has also been applied to reducing conflict between visitors in park settings by separating areas that provide different types of experiences.


Topic Area 3 – Trip Behaviors


Travel behavior refers to characteristics of current visits (and potential future visits) which affect trips to parks and nearby areas and communities. Examples include questions about activities, future visits, itineraries, overnight accommodations, trip information sources, transportation modes, fee payment, trip origins and destinations, length of trip, length of stay, and travel itineraries. Questions about itineraries, including routes and schedules, have been used in travel simulation models, which are useful in predicting vehicle and pedestrian traffic flow and managing visitor capacity in both front-country and backcountry areas of parks.


Understanding travel behavior is vital to many other park functions related to alternative transportation planning, fee structures, communications, community partnership-building, facility maintenance, and infrastructure development.


Although not an exhaustive list, many questions on individual activities and uses of park resources can be grouped into three representative sub-topics: 1) participation in recreational activities; 2) subsistence uses of park resources; and 3) questions providing information helpful in predicting or explaining activity participation and uses of park resources. Individual activities include but are not limited to: sightseeing, using visitor centers, day hiking, backpacking, picnicking, camping, shopping, observing wildlife, attending ranger-led programs, photography, boating, and fishing. The parks are also interested in knowing how individuals use a variety of park or related resources, including natural and cultural resources and park infrastructure and services. Among these are roads, trails, restrooms, parking lots, drinking water, viewpoints, gift shops, stores, and overnight accommodations. Visitors may also travel cross-country in roadless parts of a park or related area, tour historic structures or landscapes, or handle historic objects.


Depending on the site, subsistence surveys typically target visitors or nearby residents who legally harvest plants, fish, game animals, fuelwood, and sea shells. Questions included in these surveys quantify a household’s use of park resources and contribute to an understanding of the factors affecting subsistence use in a changing social, economic, and environmental context. These factors include a household’s economic base (which affects the need for subsistence resources) and ties to kinship networks (the primary mechanism for distributing harvested resources and determining the principal harvesters in a network). The questions used in these surveys are critical for understanding the effects of subsistence use on parks, the effects of park policies and recreational visits on subsistence, and for developing subsistence management plans for parks.


Questions measuring participation in recreational activities during visits are basic to understanding human behavior in parks. Such information has helped managers better understand the impacts of visitors on natural and cultural resources, the efficacy of management actions to influence visitor behavior, and how visitors’ activities influence the experiences of other users. Studies that documented the activities people participated in, where they participated, and when they participated were critical for creating visitor-flow simulation models (e.g., along park trails and rivers). In addition, up-to-date knowledge of recreational activity participation helped park managers understand how much they contributed to enhancing physical well-being by providing opportunities for healthful recreation.


In some cases, an individual survey may focus on a single activity, such as backpacking, fitness walking, wildlife viewing, or sport fishing (e.g., “creel surveys”), and will ask respondents to provide detailed information about their participation in the specific activity. The in-depth information provided by such questions has been used to inform planned management of recreation activities in a park. In other cases, parks are interested in knowing the range of activities visitors engage in while in a park or gateway region. Typically, this has been measured by asking respondents to identify, from a list of activities, those that they or members of their group took part in. This information is helpful in identifying the resources and facilities that support visitor activities.



Topic Area 4 – Preferences, Motives and Attitudes


Questions predicting and explaining preferences, motives and attitudes are often used to understand and effectively manage the effects of activity participation on park resources and visitor experiences. Responses to sense of place or place attachment questions will offer resource managers a way to identify and respond to the emotional bonds people form with natural landscapes. Park managers are increasingly interested in understanding how to expand and apply the concepts of place attachment and sense of place to ecosystem management. The questions in this section will help managers build working relationships with the public that reflect an understanding of the management gap between the social and biological sciences.


The construct of place attachment measures two dimensions: place identity and place dependence. Because place attachment is a complex construct, a larger number of items is required in order to ensure internal reliability and validity. Many recent publications outline well-tested measures, including format, layout, and number of items (Kyle, Absher, & Graefe, 2003; Kyle et al., 2003; Williams & Vaske, 2003).
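One common way to check the internal reliability mentioned above is Cronbach's alpha. The sketch below computes it for a hypothetical set of place-identity items; the item wordings, response values, and sample size are invented for illustration and are not drawn from the Pool of Known Questions.

```python
# Minimal sketch: internal-consistency check for a hypothetical place-identity
# subscale. Items and responses are illustrative only.

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-response columns of equal length."""
    k = len(items)                      # number of items in the scale
    n = len(items[0])                   # number of respondents
    def var(xs):                        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_var_sum = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Hypothetical 5-point agreement ratings (1 = strongly disagree ... 5 = strongly agree)
# for four place-identity items from six respondents.
place_identity_items = [
    [4, 5, 3, 4, 2, 5],   # "This park means a lot to me"
    [4, 4, 3, 5, 2, 5],   # "I identify strongly with this park"
    [5, 5, 2, 4, 3, 4],   # "I feel a strong connection to this park"
    [4, 5, 3, 4, 2, 4],   # "Visiting this park says a lot about who I am"
]

alpha = cronbach_alpha(place_identity_items)
print(f"Cronbach's alpha for the place-identity subscale: {alpha:.2f}")
```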


Measures of individuals’ place attachment are used to understand why people visit specific areas in or near a park and what their expectations are when visiting. Place attachment has been shown to predict environmentally responsible behaviors (Vaske & Kobrin, 2001) and is important when considering public response to certain management actions, including the response of residents living near parks.


With the current growth in participation in outdoor recreation, it is important for resource managers to learn about the needs, trip motives, and preferences of recreational users, and to act accordingly to optimize the quality of their experiences. Many new plans must consider the quality of recreational opportunities, recreational preferences, suitability of land for recreation, provisions for inventorying recreational assets, and recreational monitoring that informs decision making. The questions in this section will help managers understand differences in recreational satisfaction and activity preferences. These variables have implications beyond the superficial spatial separation of origin and recreational destination and can be used to guide choices between natural environmental protection and facility development in recreational resource management.


An area of growing concern has been the increasing presence of human-caused sounds in national parks, including sounds from road vehicles, aircraft, construction and mining equipment, generators, and landscaping equipment. NPS is developing sound management policies at a number of national parks. The questions in this section are used to examine the effects that natural and human-caused sounds have on national park visitation and to explore the effect these conditions have on visitors’ attitudes.


Although many social psychological frameworks can be applied to predicting and explaining attitudes and behaviors, the Theory of Planned Behavior (TpB) is the most widely used throughout the NPS and has been applied to studies of the relations among beliefs, attitudes, behavioral intentions, and behaviors. The questions in this section are used to determine if certain attitudes toward a behavior are influenced by beliefs about the outcomes of engaging in that behavior.


A schematic of the TpB is represented in Figure 1.



Applications of the TpB include understanding public attitudes regarding wildfires in national parks (Bright et al., 1993; Manfredo et al., 1990), power boaters’ compliance with posted speed limits (Aipanjiguly, Jacobson, & Flamm, 2003), understanding camping behavior (Young & Kent, 1985), examining hunting intentions (Hrubes, Ajzen, & Daigle, 2001), and compliance with leash laws (Nesbitt, 2006). The TpB has also been recognized as a theory suitable for application to understanding the efficacy of visitor education in natural areas (Marion & Reid, 2001). In the NPS, it has been applied to understanding backpackers’ compliance with Leave-No-Trace principles, visitors’ likelihood of using alternative transportation, and visitors’ responses to different interpretive messages.

Topic Area 5 - Crowding and Visitor Experiences


Understanding visitors’ impacts on both resources and experiences, together with how they perceive these impacts, is paramount to the management of NPS units. Perceptions of crowding, use limits, and social and environmental impacts have been the topic of social science research for decades and are an important contribution to a park’s VERP planning process. The 1978 General Authorities Act (P.L. 95-625) requires each park’s General Management Plan to include identification of and implementation commitments for visitor carrying capacities for all areas of a unit. NPS managers need to understand the carrying capacity challenges they face and the methods available to measure visitors’ perceptions of crowding. Under the NPS programmatic clearance process, questions about individual perceptions are limited to issues that the park or the NPS can manage, independently or in concert with partners. Three approaches to measuring crowding and individual perceptions of park experiences are used in this section: 1) perceptions, 2) attitudes, and 3) Visitor Experience and Resource Protection (VERP).



In the realm of social science, VERP-related research has focused largely on measuring visitors’ standards for minimally acceptable impacts. In a typical application, respondents are asked about acceptable limits for encounters, management actions, and use levels. This is typically done by having visitors respond to different levels of crowding displayed in a series of manipulated photographic images (Manning et al., 2001). Visitors are also asked to respond to a photograph that represents the typical number of visitors they saw during their most recent stay. Reviews of similar studies (Dowert et al., 2004; Manning et al., 1996) have determined that visitors are capable of perceiving various impacts on resources and experiences, and that they discriminate between their personal preferences for a condition (usually an exacting standard) and the level at which parks should take management actions to keep indicators within standards (usually a more tolerant standard).
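As a rough illustration of how such acceptability ratings are often summarized, the sketch below aggregates hypothetical photo-based ratings into a simple social norm curve and reports the use level at which the mean rating first falls below zero. The use levels, ratings, and the zero-crossing rule are illustrative assumptions, not a prescription from this program.

```python
# Minimal sketch: summarizing acceptability ratings of manipulated crowding
# photographs into a social norm curve. All numbers are hypothetical.

# Each key is the number of visitors shown in a photo; each value is a list of
# acceptability ratings (-4 = very unacceptable ... +4 = very acceptable).
ratings_by_use_level = {
    0:  [4, 3, 4, 4, 3],
    10: [3, 3, 2, 4, 2],
    20: [1, 2, 0, 1, 1],
    30: [-1, 0, -2, -1, -1],
    40: [-3, -2, -3, -4, -3],
}

# Mean acceptability at each pictured use level (the "norm curve").
norm_curve = {
    level: sum(r) / len(r) for level, r in sorted(ratings_by_use_level.items())
}

# One common reading: the minimally acceptable condition is where the mean
# rating first falls below zero.
threshold = next((level for level, mean in norm_curve.items() if mean < 0), None)

for level, mean in norm_curve.items():
    print(f"{level:>3} visitors in view: mean acceptability {mean:+.1f}")
print(f"Mean rating first falls below zero at about {threshold} visitors.")
```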


VERP-related research has also been used to gather public input on desired future conditions and indicators of quality experiences, and to determine if standards for indicators are being exceeded. Visitor surveys also help parks determine which social and biophysical factors influence visitor behavior; the amount, type, timing, and location of visitor use; and indicators of resource conditions, such as erosion, soil compaction, and vegetation health. This research suggests the on-the-ground actions that might be taken to maintain visitor-experience indicators within acceptable standards.


To date, VERP research has focused primarily on crowding, but the approach has been usefully applied to other social and natural resource contexts, including ecological impacts on campsites and trails (Shelby et al., 1988; Manning et al., 1996), minimum stream flows (Shelby & Whittaker, 1995), and soundscape management (Newman, 2006).


Topic Area 6 – Evaluation/Opinions of Services and Facilities


Questions concerning individual evaluations of park services are central to the “visitor enjoyment” component of the NPS mission. Answers to these questions assist managers and planners in determining if park services are meeting visitors’ needs. Ratings of services and facilities provided by the NPS, concessioners, cooperating associations, and other partners in parks and gateway communities are included in this topic area. Typical services and facilities evaluated include exhibits, visitor centers, signage, restrooms, concession facilities, brochures, campgrounds, shuttle or tram systems, interpretive programs and tours, and park websites.


The Social Science Program anticipates that many questions in this category will be based on an “importance-performance analysis” (IPA) framework (Martilla & James, 1977). The IPA method was developed originally in marketing and is widely used in quality management. This includes applications to national park services (Tong & Moore, 2006).


In a national park context, questions included under this topic include importance and quality/satisfaction ratings of services which individuals used, or could have used, during a visit to a park or nearby area. In the most common format, ratings of importance (“not important” to “extremely important”) and ratings of quality (“very poor” to “very good”) or ratings of satisfaction (“terrible” to “delighted” or “very unsatisfied” to “very satisfied”) are obtained for services used by visitors.1 Responses are usually analyzed to determine if gaps exist between services that are important to visitors and their perceived quality/satisfaction ratings for these services. The optimal situation exists when services important to visitors are also rated high in performance. In contrast, low performance ratings for important services or high performance ratings for unimportant services indicate a need for park management to re-allocate effort and resources.
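To make the gap comparison concrete, the sketch below contrasts hypothetical mean importance and performance ratings for a few park services and labels each with the quadrant the IPA framework would place it in. The service names, ratings, and cutoff values are invented for the example and are not drawn from any NPS survey.

```python
# Minimal sketch of an importance-performance analysis (IPA) gap check.
# Services and mean ratings (1-5 scales) are hypothetical.

services = {
    #                         (mean importance, mean performance)
    "Visitor center exhibits": (4.6, 4.4),
    "Restrooms":               (4.8, 3.2),
    "Shuttle system":          (3.9, 4.5),
    "Park brochure":           (2.7, 4.3),
}

IMPORTANCE_CUTOFF = 4.0    # grand means or scale midpoints are also commonly used
PERFORMANCE_CUTOFF = 4.0

for name, (importance, performance) in services.items():
    if importance >= IMPORTANCE_CUTOFF and performance < PERFORMANCE_CUTOFF:
        quadrant = "concentrate here (important, underperforming)"
    elif importance >= IMPORTANCE_CUTOFF:
        quadrant = "keep up the good work"
    elif performance >= PERFORMANCE_CUTOFF:
        quadrant = "possible overkill (less important, high performance)"
    else:
        quadrant = "low priority"
    gap = performance - importance
    print(f"{name:<25} gap {gap:+.1f} -> {quadrant}")
```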


Individual opinions on park management include the ideas, beliefs, attitudes, preferences, and values that visitors, potential visitors, and residents of communities near parks express regarding all aspects of NPS park management. Included in the scope of this topic are individual opinions about how parks manage natural and cultural resources, maintain physical structures, interact with community partners, guide human uses of park resources and facilities, and provide educational and other services to visitors, potential visitors, and residents of communities near parks. Also included are questions measuring the trust that visitors or nearby residents have in the NPS. Information produced by these questions is fundamental to successful civic engagement and other community outreach activities of individual parks and the NPS.


Residents of communities near parks actively engage the NPS not only as visitors, but in other ways. As discussed previously, in Alaska some residents live in designated subsistence communities within or adjacent to national parklands. Others are park concession employees, in-holders, or employees of cooperating associations. Some serve as partners in community-based organizations involved in natural and historical conservation or preservation. These latter organizations are often associated with areas that are closely affiliated with the National Park System, including National Heritage Areas, National Scenic Trails, National Heritage Corridors, and similar places designated by Congress. Questions asking these residents’ opinions about NPS management of parks and its interaction with related areas inform the creation and maintenance of productive partnerships between individual parks, nearby communities, and affiliated areas.


Topic Area 7 – Economic Impacts and Benefits Analysis


Although not an exhaustive list, many questions falling under this topic area can be divided into two sub-topics: 1) individual expenditures in time and/or dollars that occur when visiting parks and surrounding areas; and 2) expenditures in time and/or dollars that people would be willing to incur during future visits to a park or surrounding area.


Expenditures in parks or gateway regions are typically reported for an individual or group in standard industry categories, including lodging, food, transportation, and other goods and services.


By prior agreement with OMB, willingness-to-pay (WTP) questions included in surveys conducted under the NPS programmatic clearance process are limited to goods and services currently or potentially provided by the NPS, its cooperating associations, concessioners, and other partners. Questions concerning willingness to pay for non-market goods and services, such as clean air and water, are excluded from this program of studies.


Examples of WTP questions that might appear in surveys conducted under the programmatic clearance process include willingness to pay for park entrance fees, shuttle service, and other items that are relevant to the mission, management, and/or operations of the NPS. When reviewing such questions, the Social Science Program will work with investigators to ensure that items include enough context, such as the payment vehicle, to allow respondents to make an informed decision. Questions about fees should clearly describe what the fees are to be used for so that respondents can judge for themselves whether or not the fee is appropriate.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden and specifically how this collection meets GPEA requirements.


Individual surveys conducted under the programmatic clearance process will vary in the methods used to contact the public. At least 70% of the surveys conducted under this program will be conducted on-site and will be mailed back or returned on-site. Another 20% of the surveys will offer electronic response options (e.g., Survey Monkey™ or Key Survey™). The remaining 10% will be collected by way of face-to-face or telephone interviews or small focus groups. In all cases, appropriate non-response bias strategies will be used to ensure that responses are representative of the contact universe.


Surveys administered by the VSP constitute at least 50% of the information collection requests reviewed by the programmatic clearance process. This averages 5,000 annual responses. The VSP has moved to a scanned questionnaire format as the primary mode to collect information. The remaining non-VSP submissions request the use of traditional data collection methods, including paper surveys, comment cards, and electronic options.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


This effort does not duplicate any other survey being done by Federal agencies. Other Federal agencies are conducting user surveys but are not soliciting comments on the delivery of NPS products and services. Any possible duplication will be examined during the NPS Social Science Program’s technical and administrative review of individual proposals. Historically, surveys conducted under the programmatic clearance process have provided park-specific information or have supplied regional or Systemwide information that meets a specific and timely need of the NPS. Other national recreation surveys, such as the National Survey of Hunting, Fishing, and Wildlife-Associated Recreation conducted by the Fish and Wildlife Service and the National Survey on Recreation and the Environment carried out by the Forest Service, provide information on the outdoor recreation participation patterns of a national sample of households, but are not on-site visitor studies and do not cover the types of management and planning issues that are of central concern to individual units of the National Park System. The NPS Comprehensive Survey of the American Public is a “big-picture” national household survey that does not collect the finely detailed data needed to satisfy local information needs, including information on visitors to individual parks. The Forest Service’s “National Visitor Use Monitoring Program” conducts annual on-site visitor studies, but on national forests, so it does not provide the information that is typically needed to support decision-making in the National Park System. Finally, travel industry surveys periodically examine broad trends in leisure travel, but tend to use self-selected groups of respondents recruited to serve on national Web panels. Such surveys do not represent park visitors, potential visitors, or residents of communities near parks and often are limited to Internet users who are usually offered significant incentives for participation.


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


The primary purpose of this effort is to gather information needed without putting a significant additional burden on small entities. Sampling will be used and the number of questions on the surveys will be kept to a minimum. Use of electronic means of surveying also has the potential to reduce the burden on small entities.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Up-to-date data on park visitors, potential visitors, and residents of communities near parks are not available for many units of the National Park System, except through the types of information collections proposed under this program. Without this information collection, NPS will not be able to determine the kind and quality of service visitors and users want, their level of satisfaction, or ways to improve customer service in a timely manner. This Programmatic Clearance for NPS Sponsored Surveys enables NPS to collect data in an expedited manner, which not only helps us obtain data from visitors more quickly, but enables us to respond to concerns in a manner that better meets their expectations.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information, unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


This renewal request contains no special circumstances with respect to 5 CFR 1320.5(d)(2)(i) and (iii) through (viii). With respect to (ii), we may ask respondents to return their responses in fewer than 30 days after receipt of the survey. On these types of surveys, respondents normally respond rather quickly if they intend to respond at all. These are voluntary surveys, and respondents are not obligated to respond.






8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and in response to the PRA statement associated with the collection over the past three years, and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every three years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


On February 8, 2011, a 60-day Federal Register notice (76 FR 6818) was published announcing this information collection. Public comments were solicited for 60 days, ending April 11, 2011. In addition, individuals who had served as principal investigators on NPS-sponsored public surveys in FY 2009 and FY 2010 were informed that the 60-day Federal Register notice had been published. The NPS did not receive any public comments in response to the notice or the subsequent notification of investigators.

After a review of the Pool of Known Questions by the social science researchers at the University of Idaho, several questions were removed because they were outdated or underused. The review also included adding a new topic area to evaluate economic benefits. The remaining six topic areas were updated based on consultations with outside peer reviewers.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


Incentives, remuneration, and gifts are generally deemed inappropriate as part of plans for information collections conducted within the scope of the Programmatic Clearance for NPS Sponsored Surveys. In some cases, the provision of gifts and incentives to respondents may appear to be a conflict of interest. However, there may be extraordinary circumstances under which remuneration may be appropriate within the scope of this program. In the cases of information collections that seek to use incentives, program managers must describe the proposed incentive, how it will be offered to respondents, and provide a justification of its use within the supporting statement, which is required as part of each information collection request under this package.






10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


No assurance of confidentiality will be provided to respondents to the surveys conducted under this program. The Department of the Interior does not have the statutory authority to protect confidentiality or to exempt a survey from a request under the Freedom of Information Act. Instead, those who inquire about this issue will be told that their answers will be used only for statistical purposes. They will also be told that reports prepared from this study will summarize findings across individual samples so that responses will not be associated with any specific individuals. Respondents will be informed further that the NPS and its research partners will not provide information that identifies respondents, except as required by law. Indeed, personally identifying information (telephone numbers, e-mail addresses, and postal addresses) is typically stripped from data files before the files are made available to parks or to other parties. Therefore, the administration of surveys conducted under this program is essentially anonymous.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


The questions used on these surveys will not be of a sensitive nature.


12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under “Annual Cost to Federal Government.”


Based on experience with the existing Programmatic Clearance, we estimate that there will be approximately 57,500 annual respondents. Please see the table below for the estimated annual respondent burden for this collection. Given these estimates, NPS anticipates a budget of 19,350 hours per year for these proposed collections.


We estimate the total dollar value of the annual burden hours for this collection to be $581,855 (rounded). We arrived at this figure by multiplying the estimated 19,350 annual burden hours by an hourly rate of $30.07. This wage figure includes a multiplier for benefits and is based on average full compensation for private industry from the Bureau of Labor Statistics (BLS) National Compensation Survey: Occupational Wages in the United States, with hourly costs taken from BLS news release USDL-11-849, Employer Costs for Employee Compensation for March 2011 (released June 8, 2011; http://www.bls.gov/news.release/pdf/ecec.pdf).




Survey type                           Responses   Completion time   Burden hours

On-site/Mail-back surveys                50,000        18 minutes         15,000

Telephone surveys                         2,500        30 minutes          1,250

Focus groups/In-person interviews         3,000        60 minutes          3,000

Comment cards                             2,000         3 minutes            100

Total                                    57,500                            19,350


Of this burden, 6,297 hours are committed for the 13 ICs that were already approved and are currently in progress, which are included in this ICR.
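For readers who wish to verify the figures, the short calculation below reproduces the burden-hour and dollar-value arithmetic from the table and text above; it uses only numbers already stated in this item.

```python
# Arithmetic check of the annual burden estimate in item 12.
# Figures are taken directly from the table and text above.

surveys = {
    #                                    (responses, minutes per response)
    "On-site/Mail-back surveys":         (50_000, 18),
    "Telephone surveys":                 (2_500, 30),
    "Focus groups/In-person interviews": (3_000, 60),
    "Comment cards":                     (2_000, 3),
}

HOURLY_VALUE = 30.07   # BLS-based value of respondent time, per hour

total_responses = sum(n for n, _ in surveys.values())
total_hours = sum(n * minutes / 60 for n, minutes in surveys.values())

print(f"Annual responses: {total_responses:,}")              # 57,500
print(f"Annual burden hours: {total_hours:,.0f}")            # 19,350
print(f"Dollar value: ${total_hours * HOURLY_VALUE:,.2f}")   # $581,854.50, reported as $581,855 (rounded)
```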


13. Provide an estimate of the total annual non-hour cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected in item 12.)

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information (including filing fees paid for form processing). Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


We have identified no reporting and recordkeeping “non-hour cost” burdens associated with this proposed collection of information.


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.


Each survey or collection will be developed and designed on a case-by-case basis. The program staff will determine if it is more efficient and cost-effective to develop, distribute, collect, and analyze these surveys in-house or to turn to private or other non-government entities to provide that service. However, in order to provide a reasoned estimate, we have assumed a ratio of three hours of Federal work for each hour of survey time, with the Federal work at an average pay level of GS-12 step 5, which is $37.73 per hour based on OPM Salary Table 2011-GS. Using a multiplier of 1.5 (as implied by the previously referenced BLS news release) to add benefits, the full compensation cost would be $56.59 per hour. Thus, the estimated annual cost to the Federal government is 19,350 burden hours times 3 (58,050 hours of Federal work) times $56.59 per hour, or approximately $3,285,050.
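The same arithmetic can be checked with the short calculation below; all inputs come from the preceding paragraph, and no new figures are introduced.

```python
# Arithmetic check of the annualized cost to the Federal government (item 14).

ANNUAL_BURDEN_HOURS = 19_350      # total respondent burden hours from item 12
FEDERAL_TO_SURVEY_RATIO = 3       # assumed hours of Federal work per survey hour
FULL_COMP_HOURLY = 56.59          # $37.73 (GS-12 step 5) with the 1.5 benefits multiplier, as rounded in the text

federal_hours = ANNUAL_BURDEN_HOURS * FEDERAL_TO_SURVEY_RATIO   # 58,050 hours
annual_cost = federal_hours * FULL_COMP_HOURLY                  # $3,285,049.50

print(f"Federal work hours: {federal_hours:,}")
print(f"Estimated annual cost: ${annual_cost:,.2f}  (reported as approximately $3,285,050)")
```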

15. Explain the reasons for any program changes or adjustments in hour or cost burden.


The hour burden request has increased due to the number of focus groups and in-person interviews added and the increase in the number of collections processed and approved each year since 2008.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Each information collection considered under the Programmatic Clearance will describe how the data will be used. Each information collection will provide to OMB the specific tabulation methods used to synthesize, analyze and aggregate data collected.


Aggregated information will be published annually, submitted to the social science programs, and made available on-line as part of the NPS Focus digital library. Some results will be presented in internal technical reports, at conferences, and in the peer-reviewed literature. Copies of technical reports will continue to be archived with the NPS Social Science Program and entered online into the Social Science Studies Collection.


Most analyses of surveys conducted under this program will involve simple tabulations to address concrete management and planning issues. These include response frequencies, means, standard deviations, confidence intervals, and breakdowns of these by important sub-groups of respondents. When collected, expenditure data may be input into the NPS Money Generation Model, which is used to estimate the economic impact of visitor spending on gateway regions. In some cases, more complex multivariate statistical analyses are performed, as when estimating coefficients for models based on the Theory of Planned Behavior. In other cases, data from qualitative studies may involve transcriptions of interviews or focus group discussions, followed by content analyses identifying general themes.
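As a simple illustration of the tabulations described above, the sketch below computes response frequencies, a mean, a standard deviation, and an approximate 95 percent confidence interval for one hypothetical survey item; the ratings are invented, and the use of a normal approximation rather than a t-based interval is an assumption made for brevity.

```python
# Minimal sketch of the simple tabulations described above: frequencies, mean,
# standard deviation, and a 95% confidence interval for one hypothetical item.
from collections import Counter
from math import sqrt

# Hypothetical 1-5 satisfaction ratings from a small sample of respondents.
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4, 4, 5, 3, 4, 2]

n = len(ratings)
mean = sum(ratings) / n
sd = sqrt(sum((r - mean) ** 2 for r in ratings) / (n - 1))   # sample standard deviation
margin = 1.96 * sd / sqrt(n)                                 # normal approximation

print("Frequencies:", dict(sorted(Counter(ratings).items())))
print(f"Mean: {mean:.2f}  SD: {sd:.2f}")
print(f"95% CI: {mean - margin:.2f} to {mean + margin:.2f}")
```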


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


We will display OMB’s expiration date on the information collection instruments.


18. Explain each exception to the topics of the certification statement identified in "Certification for Paperwork Reduction Act Submissions."


We are requesting no exceptions to the certification statement.


1 A consensus has yet to emerge on whether performance measured by quality ratings or performance measured by satisfaction ratings is the best approach to IPA. The Social Science Program will accept either, since both are reported in the peer-reviewed literature.


