
Supporting Statement for Programmatic Clearance for NPS-sponsored Public Surveys


OMB Control Number 1024-0224 (renewal)


A. Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

The National Park Service (NPS) preserves the nation’s natural and cultural heritage and provides for its enjoyment by citizens and visitors from throughout the world. An accurate understanding of the relationship between people and parks is critical to achieving the mission of the National Park System: to protect resources unimpaired while providing for public enjoyment, education, and inspiration. Such understanding requires a sound scientific basis. Hence, social science research is a necessary and important function of the agency.


The NPS is required by the National Park Service Act of 1916 (39 Stat. 535; 16 U.S.C. 1 et seq.) to preserve national parks for the use and enjoyment of present and future generations. At the park level, this means resource preservation, public education, facility maintenance and operation, and such physical development as is roughly in proportion to the seasonally adjusted volume of use (P.L. 88-578, Sect. 6) and in consideration of visitor characteristics and activities for determining park carrying capacity (92 Stat. 3467; P.L. 95-625, Sect. 604, 11/10/78). Other federal laws and policies (the National Environmental Policy Act of 1969 and NPS Management Policies, 2006) require input from the public when assessing the impacts of development on users, potential users, and residents near parks as part of each park’s General Management Plan. These laws, policies, and regulations dictate periodic surveys of national park visitors, potential visitors, and residents of communities near parks.


As part of its Social Science Program, the NPS sponsors surveys of the public to provide park managers with information for improving the quality and utility of NPS programs. Many of the NPS surveys are similar in terms of the populations being surveyed, the types of questions being asked, and the research methods used. In 1998, the NPS and the Department of the Interior (DOI) proposed a pilot program of expedited approval for these NPS surveys. The program presented an alternative approach to complying with the Paperwork Reduction Act of 1995 (PRA). Clearance for the pilot program was granted by the Office of Management and Budget (OMB# 1024-0224 exp. 8/31/2001). In 2001, the NPS Social Science Program requested an extension of the program. Clearance was granted on September 19, 2001 (OMB# 1024-0224 exp. 9/30/2004). A second extension was granted on January 31, 2005 (OMB# 1024-0224 exp. 1/31/2008).


The benefits of this program—referred to as the NPS programmatic approval—have been significant to NPS, DOI, NPS cooperators, and the public. In the eight years of the programmatic approval, 371 individual surveys have been approved in support of NPS management and planning, providing the federal government and researchers a time and cost savings estimated to be at least $723,087 (see Attachment C: FY 2006 Annual Report).


Under the NPS programmatic approval, an alternative set of practices and procedures is employed by which OMB determines whether or not to approve proposed surveys of park visitors, potential park visitors, and/or residents of communities near parks. All questions asked of ten or more members of the public must fall within the scope of seven topic areas. The question topics included in this request have not changed from the previous approval. (See Part A, #2 for more information regarding topic areas and definitions.)


OMB reviews NPS procedures for these surveys as a program of study for the purpose of overall clearance. OMB also reviews each individual survey instrument certified and submitted by NPS as part of the program. NPS, with DOI and OMB monitoring, conducts the necessary quality control through peer review of appropriate program elements. NPS also maintains an information base of public surveys conducted in parks to be used to increase the efficiency of future surveys. All approved survey instruments and final survey reports are archived with the NPS Social Science Program as part of the Social Science Studies Collection. This collection is available to researchers. Documents in the collection are currently available in digital format through the NPS website (http://npsfocus.nps.gov/). Hard copies are located at the Social Science Program office in Denver, CO.


NPS requests that OMB grant an extension of the existing programmatic approval (OMB# 1024-0224) and assign a new expiration date and burden hour budget to NPS. The scope of the programmatic approval remains unchanged and includes individual surveys of park visitors, potential park visitors, and residents of communities near parks. The burden hour budget also is unchanged. Use of the programmatic approval will be limited to non-controversial surveys of park visitors, potential park visitors, and/or residents of communities near parks that are not likely to include topics of significant interest in the review process.


NPS will continue to provide technical and administrative review of proposed surveys and communicate review comments to investigators. In some cases, NPS may recommend that submitted proposals undergo review under the full PRA process, rather than the programmatic approval.


The programmatic approval applies to pre-tests of surveys if they include similar questions asked of 10 or more persons. Requests for approval of pre-tests may be submitted with the survey approval request or submitted separately, as deemed appropriate by NPS and OMB.


If, after consultation with investigators, a proposed survey is recommended for approval by the Social Science Program, NPS will transmit to OMB the survey instrument and certification. OMB agrees to an expedited review within ten working days of receiving the submission package. Once it has received approval from OMB, NPS will assign the OMB control number, an NPS tracking number, an expiration date that does not exceed the expiration date for the programmatic approval, and an appropriate number of burden hours to the survey. Typically, the expiration date assigned is six months from the projected end date of the survey. NPS will notify the investigator that the survey is approved. Additional monitoring will occur through annual reports submitted by NPS to DOI and OMB summarizing activity under the programmatic approval for the previous fiscal year.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. [Be specific. If this collection is a form or a questionnaire, every question needs to be justified.]

Uses of the Information

Renewal of the programmatic approval would benefit NPS management and the NPS survey process in several ways:

  • by ensuring timely acquisition of site-specific social science information for park managers to use in park planning, designing facilities, and developing interpretive programs that meet visitor needs;

  • by increasing the efficiency of the use of NPS personnel and funding in acquiring information from various publics;

  • by improving the timeliness of NPS receipt of effective public and peer comments;

  • by improving managers’ and planners’ access to valid and usable scientific knowledge, while minimizing the burden on the public.


The DOI would benefit by being able to place greater emphasis on its review and oversight functions. Further, the public would benefit from: 1) a more effective, synoptic, and less burdensome process for commenting on NPS-proposed information collections; 2) more efficient expenditures of public funds to develop and approve surveys; and 3) more efficient use of burden hours.


Finally, members of the scientific community who partner with the NPS in administering surveys would benefit through: 1) a more efficient, effective, and timely management review and approval process; 2) a greater focus on peer review to improve the scientific quality of information collections; 3) increased attention to methodological improvements and use of best practices; and 4) better administration and wider sharing of information obtained from surveys of the public.


Through FY07, 371 surveys have been conducted under the NPS programmatic approval. These surveys have been integral to park planning and management. They provide information to park managers about visitors and other important publics that would otherwise be more costly and time-consuming to collect. Parks are able to use this information in natural and cultural resource management, facility planning, interpretive programming, community partnership-building, and other critical functions. The information provided through these surveys is timely, unique, and invaluable to NPS in fulfilling its mission of preserving national parks for the use and enjoyment of present and future generations.


Question Topic Justifications

This section discusses the topics and sub-topics addressed in surveys submitted under the NPS programmatic approval. All questions in proposed surveys must fit within seven general topic areas. For this program of study, the NPS believes strongly that it is important to focus primarily on the general topic areas referenced below, rather than on specific question wording. The questions need to be flexible enough to work in the wide variety of parks that require information collections. The topics and their definitions are described in the NPS publication, “Guidelines and Submission Form for NPS-sponsored Public Surveys, Focus Groups, and Field Experiments.” This publication was revised by NPS in June 2006. It is published online at:


http://www1.nature.nps.gov/socialscience/pdf/Expedited_Guidelines_06-06.pdf.

Seven topic areas are included in the programmatic approval. Along with representative sub-topics, these include:

  1. Individual characteristics;

    • socioeconomic and demographic characteristics

    • other individual and group characteristics

  2. Trip/visit characteristics;

    • travel behavior

    • trip purpose

    • visit motives

  3. Activities and uses of park resources;

    • activity participation

    • subsistence uses of park resources

    • questions to predict and explain activities and uses

  4. Expenditures;

    • actual expenditures in parks and gateway regions

    • willingness to pay for park services

  5. Evaluations of park services;

    • importance-performance analysis

  6. Perceptions of park experiences;

    • crowding perceptions

    • Visitor Experience and Resource Protection (VERP)

    • place attachment

  7. Opinions on park management.


In the following sections, the need for each topic (along with representative sub-topics) is discussed. Methods of inquiry commonly employed in questions pertinent to the topics are also presented. The NPS Social Science Program encourages the use of proven frameworks and question formats, while recognizing that there are different ways of answering common questions through social science research. Therefore, the Social Science Program does not prescribe methods that must be used, as long as alternative formats are demonstrated to be valid and reliable. This allows research to be responsive to the specific needs of parks.


Attachment D contains a list of the pool of known questions the Social Science Program can reasonably anticipate being included in upcoming surveys. This is by no means an exhaustive list of all questions that will be submitted under the seven general categories included in the program of studies, but rather is a collection of questions taken from various survey instruments that have been submitted previously and approved by OMB. The measurement frameworks from which these known questions are derived are referenced below under the corresponding topic areas. In addition, a large percentage of the known questions come from past NPS Visitor Services Project surveys, which are also described in more detail below.


Topic Area 1 - Individual Characteristics

Individual characteristics are attributes of park visitors, potential visitors, and residents of communities near parks. Descriptions of these populations are central to the mission of the NPS because effectively promoting resource protection, visitor enjoyment and education, outreach to under-served populations, and cooperation with local communities requires basic descriptive information on who visitors, potential visitors, and nearby residents are.


Although not an exhaustive list, most questions falling under this topic area can be divided into two sub-topics: 1) socioeconomic and demographic measures, and 2) other measures describing individuals and groups that inform specific park planning and management activities.

Socioeconomic and demographic characteristics. Examples of questions included in this sub-topic area are those asking respondents’ age, zip code (or country of residence), ethnicity, race, disabilities, language preference and use, educational attainment, and household income. Under the programmatic approval, socioeconomic and demographic questions are limited to those that are germane to the topic being studied and are useful to the park or the NPS. Qualitative studies that do not generalize to specific populations should minimize the number of socioeconomic and demographic questions asked, unless there are compelling reasons for including them. Examples of such reasons are when specific characteristics (e.g., race or ethnicity) are intrinsic to the research topic (e.g., focus groups about interpretation of slavery), or when it is necessary to document the diversity of a qualitative sample. In these cases, a limited number of relevant demographic questions may be asked.


OMB publishes guidance for some demographic questions, such as race and ethnicity. For others, including age, gender, zip code, disabilities, education, language preference, and household income, the Social Science Program recommends that investigators follow formats used in the Census short form or the American Community Survey, unless there is reason to do otherwise. This maintains the option of comparing the individual characteristics of populations sampled in park surveys with regional or national populations.


Other individual or group characteristics. In addition to socioeconomic and demographic variables, many measures of individual characteristics are specific to surveys of recreational visitors. The information elicited by these questions further describes visitors, including their knowledge levels and previous experience. Examples are:

  • group size, group type (e.g., tour, alone, family, friends), and group composition (e.g., age, race, ethnicity);

  • frequency of visits (new vs. repeat or regular visitors);

  • prior knowledge of a park as a unit of the National Park System;

  • experience use history (e.g., past experience and skill levels in recreational activities available in a park);

  • knowledge of park programs and management issues.


Many of these questions have been refined over time by the NPS Visitor Services Project (VSP) and are regularly included in other on-site surveys conducted in national parks, forests, and similar areas. The Social Science Program encourages investigators to use established VSP wording for these measures when it is consistent with the purpose of their studies. Other questions, such as those measuring experience use history, have been employed and validated in numerous recreation surveys (Hammitt et al., 2004).


The information elicited by these questions supports facility planning, interpretation and education, public risk management, and outreach to under-served populations. This information is also used by NPS partners to improve service provision to visitors. Partners of the NPS include concessioners, cooperating associations, and community-based organizations, such as local governments, convention and visitor bureaus, and chambers of commerce.


Topic Area 2 - Trip/Visit Characteristics

Although not an exhaustive list, most questions falling under this topic can be divided into three broad categories: 1) travel behavior, 2) trip purpose, and 3) visit motives.


Travel behavior. Travel behavior refers to characteristics of current visits (and potential future visits) which affect trips to parks and nearby areas and communities. Examples include questions about overnight accommodations, trip information sources, transportation modes, fee payment, trip origins and destinations, length of trip, length of stay, and travel itineraries. Information sources used in trip planning also are included in this category. Questions about itineraries, including routes and schedules, can be input into travel simulation models, which are useful in predicting vehicle and pedestrian traffic flow and managing visitor capacity in both frontcountry and backcountry areas of parks.


Understanding travel behavior is vital to many other park functions related to alternative transportation planning, fee structures, communications, community partnership-building, facility maintenance, and infrastructure development. When it is consistent with the purpose of a study, the NPS Social Science Program will rely on the updated VSP questionnaire to provide standard wording for travel-behavior questions.


Trip purpose. Questions about trip purpose help the NPS understand how parks are used by visitors. In some cases, a park visit is the primary and planned reason for a trip. In other cases, visits are incidental to business travel, visiting family or friends, attending a festival, event, or other attraction, or “passing through” on the way to another destination. Knowing if a park is the primary purpose of a trip is especially important in refining the estimation of the economic impacts of visitor spending in gateway regions.


The purpose of a trip affects length of stay, information and facility needs, and activities participated in during a visit. Questions about trip purposes (e.g., whether the park is the primary destination, one of several destinations, etc.) are routinely included in VSP surveys. These items are similar to questions used in general leisure travel studies, such as the TravelScope/DIRECTIONS survey of the Travel Industry Association.


Visit motives. Visit motives are the internal states or conditions that activate travel behavior and give it direction. Motives are sometimes referred to as “wants,” “desired experiences,” or “anticipated benefits” that energize action. The study of visit motives is fundamental to understanding why people do what they do in parks. Further, the discrepancy between trip motive and what is actually experienced can be a significant determinant of visitor enjoyment.


The Recreation Experience Preference (REP) scales developed and validated by the US Forest Service are a widely employed tool for measuring visit motives. An example of questions using REP scale items is shown in Attachment D. The full pool of REP scale items is included in Attachment E. The application of the REP scales has an extensive history in recreation research (Brown et al., 1978; Driver, 1976). In 1996, Manfredo et al. published a meta-analysis of 36 studies in which the REP scales were used. Items were analyzed using structural equation modeling to test for dimensionality within broader experience constructs.


Parks use data describing visitor motives to understand the types of experiences people expect or prefer in parks. For example, knowing whether or not visitors are seeking educational experiences can explain the use or non-use of interpretive services. Knowledge of visit motives has also been applied to reducing conflict between visitors in park settings by separating areas that provide different types of experiences.


Topic Area 3 - Individual Activities and Uses of Park Resources

Individuals participate in many activities during their visits to parks, related areas, and nearby communities. Examples include sightseeing, using visitor centers, day hiking, backpacking, picnicking, camping, shopping, observing wildlife, attending ranger-led programs, photography, boating, and fishing. Individuals also use a variety of park or related resources, including natural and cultural resources and park infrastructure and services. Among these are roads, trails, restrooms, parking lots, drinking water, viewpoints, gift shops, stores, and overnight accommodations. Depending on the site, visitors or nearby residents may legally harvest plants, fish, game animals, fuelwood, and sea shells. They may travel cross-country in roadless parts of a park or related area, tour historic structures or landscapes, or handle historic objects.


Although not an exhaustive list, many questions on individual activities and uses of park resources can be grouped into three representative sub-topics: 1) participation in recreational activities; 2) subsistence uses of park resources; and 3) questions providing information helpful in predicting or explaining activity participation and uses of park resources.


Recreational activity participation. Questions measuring people’s participation in recreational activities during visits are basic to understanding human behavior in parks. Such information helps managers understand the impacts of visitors on natural and cultural resources, the efficacy of management actions to influence visitor behavior, and how visitors’ activities influence the experiences of other users. Studies documenting what activities people participate in, where they participate, and when they participate are critical for creating visitor-flow simulation models (e.g., along park trails and rivers). In addition, knowledge of recreational activity participation helps park managers understand how much parks contribute to enhancing physical well-being by providing opportunities for healthful recreation.


In some cases, a survey may focus on a single activity, such as backpacking, fitness walking, wildlife viewing, or sport fishing (e.g., “creel surveys”), and ask respondents detailed questions about their participation in the activity. The in-depth information provided by such questions informs current and/or planned management of the activity in a park.


In other cases, parks are interested in knowing the range of activities visitors engage in while in a park or gateway region. Typically, this has been measured by asking respondents to identify from a list of activities those that they or members of their group took part in. However, the Dillman review of the VSP questionnaires raised concerns about this “check-all-that-apply” format. Dillman cited recent experimental research (Smyth et al., 2006) indicating that, compared to a yes/no format for each item, this approach can bias answers towards the earlier categories in a long list and result in fewer responses being marked overall. Therefore, in cases where investigators use the check-all format for relatively long lists, the Social Science Program will recommend that a yes/no format for each item be substituted or that the sample be split and the list randomized across sample segments.
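
To illustrate the split-sample alternative described above, the minimal sketch below assigns respondents to sample segments and randomizes the order of a hypothetical activity list for each respondent; the activity names, two-segment split, and seed value are assumptions for illustration and are not items from any approved instrument.

import random

# Illustrative only: these activity names are assumptions, not items
# drawn from an approved NPS survey instrument.
ACTIVITIES = ["Sightseeing", "Day hiking", "Camping", "Wildlife viewing",
              "Attending ranger-led programs", "Picnicking"]

def assign_version(respondent_id, n_segments=2, seed=2008):
    """Assign a respondent to a sample segment and return a randomized
    ordering of the activity list for that respondent's questionnaire."""
    segment = respondent_id % n_segments       # simple systematic split
    rng = random.Random(seed + respondent_id)  # reproducible per respondent
    order = ACTIVITIES[:]
    rng.shuffle(order)                         # neutralize order effects
    return segment, order

# Example: questionnaire versions for the first three sampled respondents.
for rid in range(1, 4):
    segment, order = assign_version(rid)
    print(f"respondent {rid}, segment {segment}: {order}")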


Subsistence uses of park resources. The continuation of subsistence activities and traditional lifeways in Alaskan national parklands is mandated by the 1980 Alaska National Interest Land Conservation Act (ANILCA). This act creates a unique administrative and legal context for social science in the ANILCA units of the National Park System, particularly for studies of subsistence uses of fish, game, birds, plants, fuelwood, and other natural resources. Title 8 of ANILCA states, “The Secretary [of the Interior], in cooperation with the State and other appropriate Federal agencies, shall undertake research on fish and wildlife and subsistence uses on the public lands . . .” Thus, NPS managers of ANILCA lands have a mandated responsibility to conduct social science studies of subsistence use. This research is critical for understanding the effects of subsistence use on parks and the effects of park policies and recreational visits on subsistence, and for developing subsistence management plans for parks.


Subsistence surveys typically target native Alaskans (indigenous and otherwise) living in designated “residence-zoned” communities within and adjacent to national parks. Questions included in these surveys quantify a household’s use of park resources at the species level and contribute to an understanding of the factors affecting subsistence use in a changing social, economic, and environmental context. These factors include a household’s economic base (which affects the need for subsistence resources) and ties to kinship networks (the primary mechanism for distributing harvested resources and determining the principal harvesters in a network).


Predicting and explaining activities and uses. To effectively manage the effects of activity participation on park resources and visitor experiences, it is often useful to understand why people engage in specific activities. Although many social psychological frameworks can be applied to predicting and explaining activity participation and uses of park resources, one of the most widely employed in NPS research is the Theory of Planned Behavior (TpB). Attachment D contains an example of a study employing facets of the TpB.


TpB is a robust theory that has received widespread application and support within the social sciences. In a recent review, Francis et al. (2004) reported that over 600 studies published in PsycINFO from 1985 through January 2004 utilized this theory.


According to the TpB, individuals’ behavior is influenced by: 1) their attitudes toward the behavior; 2) their subjective norms regarding the action; and 3) their perceived control over engaging in the behavior (Ajzen, 2005). Each of these components is composed of other factors. Specifically, attitudes toward a behavior are influenced by beliefs about the outcomes of engaging in that behavior, weighted by evaluations of outcomes as positive or negative (Ajzen, 2005). Subjective norms are based on an individual’s “normative beliefs,” which are comprised of beliefs about what people who are important to the individual think should be done in a particular situation, weighted by the individual’s motivation to comply with these people (Ajzen, 2005). Finally, perceived control is a function of “control belief strength” and “control belief power” (Ajzen, 2005).

A schematic of the TpB is represented in Figure 1.
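
The weighted-sum structure described above is often summarized with the expectancy-value notation shown below; this is an illustrative rendering of the formulation generally associated with the TpB, not an excerpt from the cited sources.

  A \propto \sum_{i} b_i e_i, \qquad SN \propto \sum_{j} n_j m_j, \qquad PBC \propto \sum_{k} c_k p_k

  BI = w_1 A + w_2 SN + w_3 PBC

Here A is the attitude toward the behavior (behavioral belief strength b_i weighted by outcome evaluation e_i), SN is the subjective norm (normative belief n_j weighted by motivation to comply m_j), PBC is perceived behavioral control (control belief strength c_k weighted by control belief power p_k), and BI is behavioral intention, with the weights w_1, w_2, and w_3 estimated empirically.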



In order to evaluate the internal reliability and validity of concepts included within the TpB, it is desirable to have at least three, and preferably more, indicators of each. Thus, TpB surveys often employ multiple items to measure the underlying constructs (Babbie, 2001; DeVellis, 2003; Foddy, 1993; Fowler, 1993; Noar, 2003).


The TpB has received application within the field of recreation research (Fishbein & Manfredo, 1992). This includes applications to understanding public attitudes regarding wildfires in national parks (Bright et al., 1993; Manfredo et al., 1990), power boaters’ compliance with posted speed limits (Aipanjiguly, Jacobson, & Flamm, 2003), understanding camping behavior (Young & Kent, 1985), examination of hunting intentions (Hrubes, Ajzen, & Daigle, 2001), and compliance with leash laws (Nesbitt, 2006). The TpB also has been recognized as a theory suitable for application to understanding the efficacy of visitor education in natural areas (Marion & Reid, 2001). In the NPS, it has been applied to understanding backpackers’ compliance with Leave-No-Trace principles, visitors’ likelihood of using alternative transportation, and visitors’ response to different interpretive messages.

Topic Area 4 - Individual Expenditures

Although not an exhaustive list, many questions falling under this topic area can be divided into two sub-topics: 1) individual expenditures in time and/or dollars that occur when visiting parks and surrounding areas; and 2) expenditures in time and/or dollars that people would be willing to incur during future visits to a park or surrounding area.


Actual expenditures. Expenditures in parks or gateway regions are typically reported for an individual or group in standard industry categories, including lodging, food, transportation, and other goods and services. (An example of questions used to measure visitor expenditures is included in Attachment D.)


In addition to reporting simple spending totals, data obtained from these questions can be used to estimate the impact of visitor spending on the economies of gateway regions. One frequently used approach is to apply the NPS Money Generation Model—version 2 (MGM2) to primary expenditure data collected from visitor surveys. The MGM2 is a conservative, peer-reviewed assessment tool developed by Dr. Daniel Stynes of Michigan State University that estimates the contribution of spending by park visitors to local sales, income, and jobs. The MGM2 is based on IMPLAN, the most common economic impact model employed by Federal resource management agencies (Stynes, 2005).


When included in NPS Visitor Services Project surveys, non-response to the expenditure question is usually less than 10%. Item non-response within the question is handled conservatively by treating lines left blank as zeroes, rather than as missing values. This decreases overall spending averages by about 7% compared to treating blanks as missing. Outliers are evaluated and either omitted or adjusted. Spending averages with and without outliers are reported. When available, gross receipts from concession services in corresponding categories (e.g., lodging) provide an approximate check on spending estimates based on survey data. These comparisons have been quite close. In the most recent report for Yellowstone National Park, the survey-based estimate of in-park spending was about 5% less than reported concession receipts, well within the 95% confidence interval of the visitor spending estimate.
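
The effect of the blanks-as-zero convention, together with a simple outlier screen, can be illustrated with the following sketch; the spending values, the single lodging category, and the two-standard-deviation cutoff are hypothetical assumptions and are not NPS procedures.

import statistics

# Hypothetical per-group lodging expenditures from one survey; None marks
# an expenditure line the respondent left blank.
lodging = [120.0, None, 85.5, 0.0, None, 300.0, 95.0, 2500.0]

# Conservative treatment: blanks counted as zeroes (lowers the average).
as_zero = [x if x is not None else 0.0 for x in lodging]
# Alternative treatment: blanks dropped as missing values.
as_missing = [x for x in lodging if x is not None]

print("mean, blanks as zero:   ", round(statistics.mean(as_zero), 2))
print("mean, blanks as missing:", round(statistics.mean(as_missing), 2))

# Illustrative outlier screen: flag values more than two standard
# deviations above the mean for review, omission, or adjustment.
mu, sd = statistics.mean(as_missing), statistics.stdev(as_missing)
print("flagged outliers:", [x for x in as_missing if x > mu + 2 * sd])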


The major use of the MGM2 estimates by parks is to demonstrate to partners in gateway regions the economic activity attributable to spending by park visitors. Given this use, the NPS considers the level of accuracy in the estimates to be acceptable. For significant policy or planning applications (e.g., estimating the economic impact of alternatives in an EIS), NPS recommends that other approaches be used (e.g., trip diaries, sales tax data).


Willingness to pay. By prior agreement with OMB (Attachment F), willingness-to-pay (WTP) questions included in surveys conducted under the NPS programmatic approval are limited to goods and services currently or potentially provided by the NPS, its cooperating associations, concessioners, and other partners. Questions concerning willingness to pay for non-market goods and services, such as clean air and water, are excluded from this program of studies.


Examples of WTP questions that might appear in surveys conducted under the programmatic approval include willingness to pay for park entrance fees, shuttle service, and other items that are relevant to the mission, management, and/or operations of the NPS. When reviewing such questions, the Social Science Program will work with investigators to ensure that items include enough context, such as the payment vehicle, to allow respondents to make an informed decision. Willingness-to-pay questions about proposed fees should clearly describe what the fees are to be used for so that respondents can judge for themselves whether or not the fee is appropriate.


Topic Area 5 - Individual Evaluations of Park Services

Questions concerning individual evaluations of park services are central to the “visitor enjoyment” component of the NPS mission. Answers to these questions assist managers and planners in determining if park services are meeting visitors’ needs. Ratings of services and facilities provided by the NPS, concessioners, cooperating associations, and other partners in parks and gateway communities are included in this topic area. Typical services and facilities evaluated include exhibits, visitor centers, signage, restrooms, concession facilities, brochures, campgrounds, shuttle systems, interpretive programs and tours, and park websites.


The Social Science Program anticipates that many questions in this category will be based on an “importance-performance analysis” (IPA) framework (Martilla & James, 1977). The IPA method was developed originally in marketing and is widely used in quality management. This includes applications to services in national parks (Tong & Moore, 2006).


In a national park context, questions under this topic include importance and quality/satisfaction ratings of services which individuals used, or could have used, during a visit to a park or nearby area. In the most common format, ratings of importance (“not important” to “extremely important”) and ratings of quality (“very poor” to “very good”) or ratings of satisfaction (“terrible” to “delighted” or “very unsatisfied” to “very satisfied”) are obtained for services used by visitors. Responses are usually analyzed to determine if gaps exist between services that are important to visitors and their perceived quality/satisfaction ratings for these services. The optimal situation exists when services important to visitors are also rated high in performance. In contrast, low performance ratings for important services or high performance ratings for unimportant services indicate a need for park management to re-allocate effort and resources.
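
The gap and quadrant logic of importance-performance analysis can be sketched as follows; the service names, mean ratings, and 4.0 cut-points on an assumed 5-point scale are hypothetical, and analysts often place the quadrant cut-points at scale midpoints or grand means instead.

# Hypothetical mean ratings on 5-point scales (1 = lowest, 5 = highest).
services = {
    "Visitor center exhibits": (4.6, 4.4),   # (importance, performance)
    "Restrooms":               (4.8, 3.1),
    "Park website":            (3.2, 4.5),
    "Shuttle system":          (4.1, 3.9),
}

IMPORTANCE_CUT = 4.0   # assumed cut-points dividing the IPA quadrants
PERFORMANCE_CUT = 4.0

for name, (imp, perf) in services.items():
    gap = imp - perf   # positive gap: importance exceeds performance
    if imp >= IMPORTANCE_CUT and perf < PERFORMANCE_CUT:
        quadrant = "concentrate here"
    elif imp >= IMPORTANCE_CUT:
        quadrant = "keep up the good work"
    elif perf >= PERFORMANCE_CUT:
        quadrant = "possible overkill"
    else:
        quadrant = "low priority"
    print(f"{name}: gap = {gap:+.1f} ({quadrant})")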


Topic Area 6 - Individual Perceptions of Their Park Experiences

Individual perceptions of park experiences include the public’s awareness and observations of natural and social environments in parks and nearby areas. Examples are visitors’, potential visitors’, and nearby residents’ perceptions of the social and psychological benefits provided by parks and nearby areas, including opinions about visitor use and resource conditions and how these influence enjoyment. Nearby residents’ perceptions of parks and the ways communities are affected by them fall within this topic area.


Under the NPS programmatic approval, questions about individual perceptions are limited to issues that the park or the NPS can manage, independently or in concert with partners. Thus, asking visitors about interactions with members of their own group is inappropriate, but questions about how experiences are affected by visitation levels, development levels, or the natural resource conditions of a park are within the scope of this topic.


Individual perceptions of park experiences can be divided into a large number of sub-topics. Rather than attempt to enumerate and describe all of these, three of the most commonly used categories of questions fitting within the topic are discussed below. These include: 1) measures of crowding perceptions; 2) questions related to the NPS Visitor Experience and Resource Protection (VERP) planning process; and 3) questions measuring place attachment.


Crowding perceptions. A vital interest of NPS managers, arising from the legal mandate to establish visitor carrying capacities, is measuring visitors’ perceptions of crowding. Crowding is defined in the social science literature as a negative evaluation of social density. Research in outdoor settings over the past four decades has shown this evaluation to be context-dependent. For example, relatively high social densities may be acceptable to visitors at scenic overlooks in the frontcountry, but the same social density in the backcountry may be unacceptable. Even in backcountry and wilderness areas, social densities may be evaluated differently, depending on whether one is at a trailhead or an overnight campsite.


Visitors often are able to avoid exposure to unacceptably low or high densities by voluntarily “displacing” themselves to times and areas in a park that correspond to their desired experiences. Preserving a diversity of settings characterized by different levels of social density is an important visitor capacity management tool because it offers people the freedom to choose the location or time in a park most enjoyable to them. The questions in this category help maintain this ability by telling managers which areas of a park visitors evaluate as crowded and uncrowded.


In the vast majority of cases, visitor capacities are not set for an entire park, but for specific areas within a park based on legal mandates (e.g., wilderness), physical capacity (e.g., of parking lots), and prescribed visitor experiences (e.g., primitive vs. developed areas). Large parks seek to offer a range of capacities. Only in the case of the smallest units of the National Park System (e.g., a historic house) would a single visitor capacity be set for an entire park. In this case, the capacity depends on straightforward physical limits, such as the number of visitors who can be accommodated in a room during a guided tour.


All questions in this section provide information to managers that allow them to identify where and when park visitors feel crowded or uncrowded and how visitors cope with unacceptable levels of social density. They are essential to meet statutory responsibilities and to fulfill the NPS mission to provide enjoyable visitor experiences while maintaining resources unimpaired for the enjoyment of future generations.


Unless there are good reasons to do otherwise, the Social Science Program recommends that studies conducted under the NPS programmatic approval employ a single 9-point scale to measure crowding perceptions. This recommendation is based on a recent review conducted by Shelby and Vaske (in press). The review is the most comprehensive look at crowding measures to date. It examined perceived crowding using 615 evaluation contexts obtained from 181 studies that used a 9-point scale ranging from “not at all crowded” to “extremely crowded” (Heberlein & Vaske, 1977). The results support the utility of the crowding measure in the diverse contexts characteristic of the National Park System. According to the authors:


In conclusion, comparisons of aggregate data from 181 studies showed that the single-item measure of perceived crowding continues to be useful in a variety of situations. It varies with a number of factors that influence use. It provides useful comparative data that allow managers to understand better the carrying capacity challenges that face them and give investigators an idea about what kinds of studies would be most useful.


A complementary and widely used approach when measuring crowding perceptions is to ask respondents how the number of people they encountered relates to their expectations or preferences for contacts. Specifically, visitors are asked if they encountered more, fewer, or about the same number of people as they expected or preferred during a visit. Because perceived crowding involves a cognitive evaluation of encounters, models including expectation and preference variables better explain crowding perceptions than the number of contacts alone (Shelby et al., 1983; Kuentzel & Heberlein, 2003).


Visitor Experience and Resource Protection. Understanding visitors’ impacts on both resources and experiences, together with how they perceive these impacts, is paramount to the management of NPS units. Perceptions of crowding, use limits, and social and environmental impacts have been the topic of social science research for decades and are an important contribution to a park’s VERP planning process.


The VERP process involves five stages (Hof et al., 1994). These include identifying and describing: 1) desired future conditions for park resources and visitor experiences; 2) indicators of quality experiences and resource conditions; 3) standards that define minimally acceptable conditions for the indicators; 4) monitoring techniques to determine if and when management action must be taken to keep conditions within standards; and 5) management actions to ensure that all indicators are maintained within specified standards.


In the realm of social science, VERP-related research in the NPS has focused largely on measuring visitors’ standards for minimally acceptable impacts (stage 3). In a typical application to identifying standards (in this case, for visitor experiences), respondents are asked about acceptable limits for encounters, usually by having visitors respond to different levels of crowding displayed in a series of photographic simulations (Manning et al., 2001). Visitors judge photos according to several dimensions, including “acceptability” (very unacceptable to very acceptable), “preference” (which photograph shows the level of use the respondent would prefer), and “management action” (which photograph shows the highest level of use that the NPS should allow before it limits use). A final question asks respondents which photograph represents the typical number of visitors they saw during their stay. Reviews of these studies (Dowart et al., 2004; Manning et al., 1996) have determined that visitors are capable of perceiving various impacts on resources and experiences, and that they discriminate between their personal preferences for a condition (usually an exacting standard) and the level at which parks should take management actions to keep indicators within standards (usually a more tolerant standard).
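
One common way to summarize responses to such photo-based questions, sketched below with hypothetical data, is to average acceptability ratings at each depicted use level and read the point where the resulting norm curve crosses zero as an estimate of the minimally acceptable condition; the use levels, ratings, and the -4 to +4 acceptability scale are assumptions for illustration only.

import statistics

# Hypothetical acceptability ratings (-4 = very unacceptable, +4 = very
# acceptable) for photographs showing increasing numbers of people; each
# list holds one rating per respondent.
ratings_by_photo = {
    0:  [4, 4, 3, 4, 3],      # key = number of people shown in the photo
    10: [3, 2, 3, 2, 3],
    20: [1, 0, 1, -1, 0],
    30: [-2, -1, -2, -3, -2],
    40: [-4, -3, -4, -3, -4],
}

# Mean acceptability per use level traces the social norm curve.
curve = {n: statistics.mean(r) for n, r in sorted(ratings_by_photo.items())}
for n, mean_acceptability in curve.items():
    print(f"{n:>3} people shown: mean acceptability = {mean_acceptability:+.2f}")

# The first use level rated unacceptable on average approximates the
# minimally acceptable condition (the standard) for this indicator.
print("curve crosses zero between",
      max((n for n, m in curve.items() if m >= 0), default=None),
      "and", min((n for n, m in curve.items() if m < 0), default=None), "people")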


In addition to providing input on indicator standards, VERP-related research has been used to gather public input on desired future conditions (stage 1), on indicators of quality experiences (stage 2), and to determine if standards for indicators are being exceeded (stage 4). Visitor surveys also help parks determine which social and biophysical factors influence perceptions of quality experiences identified in stage 3. These may include visitor behavior, the amount, type, timing, and location of visitor use, and indicators of resource conditions, such as erosion, soil compaction, and vegetation health. This research suggests on-the-ground actions that might be taken to maintain visitor-experience indicators within acceptable standards.


To date, VERP research has focused primarily on crowding, but the approach has been usefully applied to other social and natural resource contexts, including ecological impacts on campsites and trails (Shelby et al., 1988; Manning et al., 1996), minimum stream flows (Shelby & Whittaker, 1995), and soundscape management (Newman, 2006).


Attachment D contains examples of crowding and VERP questions.


Place attachment. A popular measure employed to study facets of individuals’ perceptions of park experiences is “attachment to place.” The notion of recreationists developing psychological bonds to physical locations has remained a topic of interest for researchers for years (Kyle et al., 2003; Williams & Roggenbuck, 1989; Williams & Vaske, 2003).


The construct of place attachment measures two dimensions: place identity and place dependence. Because place attachment is a complex construct, a relatively large number of items is required to ensure internal reliability and validity. Many recent publications outline well-tested measures, including format, layout, and number of items (Kyle, Absher, & Graefe, 2003; Kyle et al., 2003; Williams & Vaske, 2003).
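
As one illustration of an internal-reliability check for a multi-item scale, the sketch below computes Cronbach's alpha, a commonly reported consistency statistic; it is not a measure prescribed by the Social Science Program, and the item scores shown are hypothetical.

import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a multi-item scale. Each inner list holds one
    item's scores, with respondents in the same order across items."""
    k = len(item_scores)
    n = len(item_scores[0])
    item_vars = [statistics.variance(item) for item in item_scores]
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(item_vars) / statistics.variance(totals))

# Hypothetical 5-point place-identity items answered by six respondents.
items = [
    [5, 4, 2, 5, 3, 4],
    [4, 4, 1, 5, 3, 5],
    [5, 3, 2, 4, 2, 4],
]
print("Cronbach's alpha =", round(cronbach_alpha(items), 2))  # about 0.93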


Measures of individuals’ place attachment are used to understand why people visit specific areas in or near a park and what their expectations are when visiting. Place attachment has been shown to predict environmentally responsible behaviors (Vaske & Kobrin, 2001) and is important when considering public response to certain management actions, including the response of residents living near parks. Attachment D contains an example of a place-attachment scale used in recreation research. The following website provides a bibliography that chronicles the evolution of the place-attachment concept: http://pegasus.cc.ucf.edu/~janzb/place/placesense.htm


Topic Area 7 - Individual Opinions on Park Management

Individual opinions on park management include the ideas, beliefs, attitudes, preferences, and values that visitors, potential visitors, and residents of communities near parks express regarding all aspects of NPS park management. Included in the scope of this topic are individual opinions about how parks manage natural and cultural resources, maintain physical structures, interact with community partners, guide human uses of park resources and facilities, and provide educational and other services to visitors, potential visitors, and residents of communities near parks. Also included are questions measuring the trust that visitors or nearby residents have in the NPS. Information produced by these questions is fundamental to successful civic engagement and other community outreach activities of individual parks and the NPS.


Residents of communities near parks actively engage the NPS not only as visitors, but in other ways. As discussed previously, in Alaska some residents live in designated subsistence communities within or adjacent to national parklands. Others are park concession employees, in-holders, or employees of cooperating associations. Some serve as partners in community-based organizations involved in natural and historical conservation or preservation. These latter organizations are often associated with areas that are closely affiliated with the National Park System, including National Heritage Areas, National Scenic Trails, National Heritage Corridors, and similar places designated by Congress. Questions eliciting these residents’ opinions about NPS management of parks and its interaction with related areas inform the creation and maintenance of productive partnerships between individual parks, nearby communities, and affiliated areas.


References for Part A2.

Ajzen, I. (2005). Attitudes, personality and behavior (2nd ed.). New York: Open University Press.

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179-211.

Aipanjiguly, S., Jacobson, S., & Flamm, R. (2003). Conserving manatees: Knowledge, attitudes, and intentions of boaters in Tampa Bay, Florida. Conservation Biology, 17(4), 1098-1105.

Babbie, E. (2001). The practice of social research (9th ed.). Belmont, CA: Wadsworth.

Bright, A.D., Fishbein, M., Manfredo, M., & Bath, A. (1993). Application of the theory of reasoned action to the National Park Service's controlled burn policy. Journal of Leisure Research, 25(3), 263-280.

Brown, P.J., Driver, B.L., & McConnell, C. (1978). The opportunity spectrum concept and behavioral information in outdoor recreation supply inventories: Background and application. Integrated Inventories of Renewable Natural Resources: Proceedings of the Workshop. (USDA Forest Service General Technical Report RM-55). Fort Collins, CO: Rocky Mountain Forest and Range Experiment Station.

DeVellis, R. F. (2003). Scale development: Theory and applications. Thousand Oaks, CA: Sage Publications.

Dowart, C.E., Leung, Y., & Moore, R. (2004). Managing visitors’ perceptions. Parks and Recreation (May). http://www.nrpa.org/default.aspx?documentId=916 (accessed September 17, 2007).

Driver, B.L. (1976). Quantification of outdoor recreationists’ preferences. Research Camping and Environmental Education. Penn State Series II, University Park, PA: Penn State University.

Fishbein, M., & Manfredo, M. (1992). A theory of behavior change. In M.J. Manfredo (Ed.), Influencing Human Behavior (pp. 29-50). Champaign, IL: Sagamore Publishing.

Foddy, W. (1993). Constructing questions for interviews and questionnaires: Theory and practice in social research. Cambridge: Cambridge University Press.

Fowler, F. J. (1993). Survey Research Methods (Revised ed. Vol. 1). Newbury Park, CA: Sage Publications.

Francis, J. J., Eccles, M. P., Johnston, M., Walker, A., Grimshaw, J., Foy, R., et al. (2004). Constructing questionnaires based on the Theory of Planned Behavior: A manual for health service providers. Newcastle upon Tyne, United Kingdom: University of Newcastle.

Hammitt, W. E., Backlund, E. A., & Bixler, R.D. (2004). Experience use history, place bonding and resource substitution of trout anglers during recreation engagement. Journal of Leisure Research, 36(3), 356-378.

Heberlein, T.A., & Vaske, J. (1977). Crowding and visitor conflict on the Bois Brule River (report WISC WRC 77-04). Madison, WI: University of Wisconsin Water Resources Center.

Hof, M., Hammett, J., Rees, M., Belnap, J., Poe, N., Lime, D., & Manning, R. (1994). Getting a handle on visitor carrying capacity—A pilot project at Arches National Park. Park Science, 14(1), 11-13.

Hrubes, D., Ajzen, I., & Daigle, J. (2001). Predicting hunting intentions and behavior: An application of the theory of planned behavior. Leisure Sciences, 23, 165-178.

Kuentzel, W.F., & Heberlein, T.A. (2003). More visitors, less crowding: Change and stability in norms over time at the Apostle Islands. Journal of Leisure Research, 35(4), 349-371.

Kyle, G., Absher, J., & Graefe, A. (2003). The moderating role of place attachment on the relationship between attitudes towards fees and spending preferences. Leisure Sciences, 25(1), 33-50.

Kyle, G., Graefe, A., Manning, R., & Bacon, J. (2003). An examination of the relationship between leisure activity involvement and place attachment among hikers along the Appalachian Trail. Journal of Leisure Research, 35(3), 249-273.

Manfredo, M.J., Driver, B.L., & Tarrant, M.A. (1996). Measuring leisure motivation: A meta-analysis of the Recreation Experience Preference scales. Journal of Leisure Research, 28(3), 188-213.

Manfredo, M.J. (Ed.). (1992). Influencing human behavior: Theory and application in recreation, tourism, and natural resources management. Champaign, IL: Sagamore Publishing.

Manfredo, M.J., Fishbein, M., Haas, G., & Watson, A. (1990). Attitudes toward prescribed fire policies. Journal of Forestry, 88(7), 19-23.

Manning, R., Newman, P., Valliere, W.A., Wang, B., & Lawson, S.R. (2001). Respondent self-assessment of research on crowding norms in outdoor recreation. Journal of Leisure Research, 33(3), 251-271.

Manning, R., Valliere, W.A., Wang, B., & Jacobi, C. (1996). Crowding norms: Alternative measurement approaches. Leisure Sciences, 21(2), 97-115.

Marion, J., & Reid, S. (2001). Development of the United States Leave No Trace program: An historical perspective. In M. B. Usher (Ed.), Enjoyment and Understanding of the National Heritage (pp. 81-92). Edinburgh, Scotland: Scottish Natural Heritage & the Stationery Office.

Martilla, J. A., & James, J. C. (1977). Importance-performance analysis. Journal of Marketing, 41(1), 77-79.

Nesbitt, R. K. (2006). Toward an understanding of noncompliant behavior in outdoor recreation: Linking the theory of planned behavior to off-leash dogs at William B. Umstead State Park. Unpublished Masters Thesis, North Carolina State University, Raleigh, NC.

Newman, P. (2006). Understanding and managing research in national parks: Muir Woods soundscape questionnaire. (OMB #1024-0224, NPS #06-049). Technical report forthcoming.

Noar, S. M. (2003). The role of structural equation modeling in scale development. Structural Equation Modeling, 10(4), 622-647.

Office of Management and Budget. Provisional guidance on the implementation of the 1997 standards for Federal data on race and ethnicity. Washington, DC: Office of Management and Budget, 2000. http://www.whitehouse.gov/omb/inforeg/re_guidance2000update.pdf (accessed September 17, 2007).

Shelby, B.J., & Vaske, J. (In press). Crowding as a descriptive indicator and an evaluative standard: Results from 30 years of research.

Shelby, B.J., Vaske, J., & Harris, R. (1988). User standards for ecological impacts at wilderness campsites. Journal of Leisure Research, 20(3), 245-256.

Shelby, B., & Whittaker, D. (1995). Flows and recreation quality on the Dolores River: Integrating overall and specific evaluations. Rivers, 5, 121-132.

Shelby, B., Heberlein, T.A., Vaske, J., & Alfano, G. (1983). Expectations, preferences and feeling crowded in recreation activities. Leisure Sciences, 6(1), 1-14.

Smyth, J.D., Dillman, D.A., Christian, L.M., & Stern, M.J. (2006). Effects of using visual design principles to group response options in Web surveys. International Journal of Internet Science, 1(1), 5-15.

Stynes, D. (2005). Economic significance of recreational uses of national parks and other public lands. Social Science Research Review, 5(1), 1-33.

Tong, J., & Moore, S.A. (2006). Importance-satisfaction analysis for marine-park hinterlands: A Western Australian case study. Tourism Management, 28(3), 768-776.

Vaske, J., & Kobrin, K. (2001). Place attachment and environmentally responsible behavior. Journal of Environmental Education, 32(4), 16-21.

Williams, D., & Roggenbuck, J. (1989). Measuring place attachment: Some preliminary results. Paper presented at the National Recreation and Parks Association Conference; Abstracts: 1989 Leisure Research Symposium (p. 32), Arlington, VA.

Williams, D., & Vaske, J. (2003). The measurement of place attachment: Validity and generalizability of a psychometric approach. Forest Science, 49(6), 830-840.

Young, R.A., & Kent, A.T. (1985). Using the theory of reasoned action to improve the understanding of recreation behavior. Journal of Leisure Research, 17(2), 90-106.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden [and specifically how this collection meets GPEA requirements.].


Individual surveys likely to be conducted under the programmatic approval will vary in the ways they contact the public. Many surveys conducted under this program will require direct contact with respondents, which may preclude the use of information-technology applications. However, some surveys will offer electronic response options in addition to more traditional response modes. Representative sampling methods will be employed, except in cases where qualitative data from purposeful or other non-probability samples are of legitimate interest (e.g., pre-tests, cognitive interviews, focus groups). In all cases, surveys will be limited to the populations defined under the previous approval: park visitors, potential park visitors, and residents of communities near parks. Under the terms of this program, for each proposed NPS-sponsored public survey, the investigators will work with park staff to identify and develop the objectives, scope, and target audience.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


Possible duplication will be examined in the NPS Social Science Program’s technical and administrative review of individual proposals. However, surveys likely to be conducted under the programmatic approval typically provide information specific to a unit of the National Park System, or they supply regional or Systemwide information that meets a specific and timely need of the NPS. In these cases, there are typically no other information sources available which can be used in lieu of the proposed information collection. Other national recreation surveys, such as the National Survey of Fishing, Hunting, and Wildlife-Associated Recreation conducted by the Fish and Wildlife Service and the National Survey on Recreation and the Environment carried out by the Forest Service, provide information on the outdoor recreation participation patterns of a national sample of households, but are not on-site visitor studies and do not cover the types of management and planning issues that are of central concern to individual units of the National Park System. The NPS Comprehensive Survey of the American Public is a “big-picture” national household survey that does not collect the finely detailed data that satisfy local information needs, including information on visitors to individual parks. The Forest Service’s “National Visitor Use Monitoring Program” conducts annual on-site visitor studies on national forests, not national parks, so it does not provide the information that is typically needed to support decision-making in the National Park System. Finally, travel industry surveys periodically examine broad trends in leisure travel, but tend to use self-selected groups of respondents recruited to serve on national Web panels. Such surveys do not represent park visitors, potential visitors, or residents of communities near parks and often are limited to Internet users who are usually offered significant incentives for participation.




5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.

On occasion, owners or employees of small businesses or organizations may be contacted as part of the individual surveys proposed under this program. However, they will be surveyed in their role as visitors, potential visitors, or residents of communities near parks, not in their capacity as business owners or employees, so the burden on small businesses or other small entities will be minimal.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

The consequences of not having these data for park planning are considerable. Substantial funds are sometimes spent on public services and facilities based on only one source of information about users—the assumptions of project planners. This can result in waste from several causes, including unnecessary maintenance, over-investment in relatively under-utilized facilities, inefficient public safety and health support, poor understanding of gateway communities, degradation of facilities and resources due to poor management of visitors, and interpretive media that fail to communicate effectively with diverse audiences. To minimize these avoidable problems, NPS policies require park plans to be constructed from reliable information bases (NPS Management Policies, 2006). Up-to-date data on park visitors, potential visitors, and residents of communities near parks are not available for many units of the National Park System, except through the types of information collections proposed under this program.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.

Individual surveys likely to be conducted under this program are typically non-recurring, and none of the situations described are involved. However, potential special circumstances will be considered in the NPS Social Science Program’s technical and administrative review for individual surveys proposed under the programmatic approval.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice [and in response to the PRA statement associated with the collection over the past three years] and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported. [Please list the names, titles, addresses, and phone numbers of persons contacted.]


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


As required by 5 CFR 1320.8(d), the agency’s 60-day notice appeared in the Federal Register on March 27, 2007 (Vol. 72, No. 58, Page 14295). This notice solicited comments on the proposed extension of an existing information collection (see Attachment G, 60-day Federal Register notice). In addition, individuals who had served as principal investigators on NPS-sponsored public surveys in FY 2005 and FY 2006 were informed that the 60-day Federal Register notice had been published. The comment period closed on May 29, 2007. The NPS received three public comments as a result of the publication of this 60-day Federal Register notice and the subsequent notice to investigators.


One commenter, responding to the notice, stated that sufficient information has been collected over the eight years of the programmatic approval and that the program should be discontinued. In response, the NPS notes that the information collected under the program is not redundant, because the needs of parks continue to change. The NPS Social Science Program conducts a detailed review of all information collections submitted under the programmatic approval process to ensure that studies are not duplicated and that the information being collected is useful and relevant to the current management of NPS units.


A second comment inquired about the nature of the programmatic approval. Social Science Program staff explained the process, and the commenter had no further questions.


A final comment was submitted by an investigator who has conducted previous research for the NPS. The researcher outlined a number of concerns with the programmatic approval process, including:

  • the length of time a submission sometimes spends in the review process;

  • the inability of investigators to conduct methodological work;

  • reluctance by NPS/OMB to accept certain research approaches;

  • inconsistency in the review process;

  • a need for studies to be able to replicate previous questionnaire designs for comparability;

  • a need to improve communication between the Social Science Program and investigators.


In response, the Social Science Program has taken steps to improve communication with the research community by periodically e-mailing updates informing investigators of changes to the review process (e.g., extended review times and updated contact information). In addition, on April 19, 2007, the Social Science Program co-sponsored an information-sharing session at the George Wright Society meeting in St. Paul, MN, to discuss the programmatic approval with investigators and park staff. During this session, NPS representatives explained the Paperwork Reduction Act and the history and evolution of the programmatic approval. Stakeholders were given substantial time to ask questions about the process and to express concerns and/or support. Overall, stakeholders appreciated the program’s ability to allow needed research to be done, though they remained concerned about perceived inconsistencies in reviews and the timeliness of obtaining approval. Based on these comments, the Social Science Program is making significant efforts through its strategic planning to enhance its capability to review and process submissions and to improve communication with researchers and NPS field staff.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


Respondents to surveys conducted under the programmatic approval typically are not offered a gratuity for completing a questionnaire. In those cases where remuneration is proposed by investigators, it will be carefully considered in the NPS Social Science Program’s technical and administrative review. If payments or gifts are planned (e.g., for participation in focus groups), the NPS Social Science Program will certify that such a proposal is consistent with best practices in survey research and with OMB guidelines. The proposed use of gifts or incentives will be disclosed in the submission form accompanying any survey instrument submitted to OMB under this program. OMB limits on compensation for participation in focus groups ($50.00) or cognitive interviews ($35.00) will be observed.


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

No assurance of confidentiality will be provided to respondents to the surveys conducted under this program. The Department of the Interior does not have the statutory authority to protect confidentiality or to exempt a survey from a request under the Freedom of Information Act. Instead, those who inquire about this issue will be told that their answers will be used only for statistical purposes. They will also be told that reports prepared from this study will summarize findings across individual samples so that responses will not be associated with any specific individuals. Respondents will be informed further that the NPS and its research partners will not provide information that identifies respondents, except as required by law. In practice, personally identifying information (telephone numbers, e-mail addresses, and postal addresses) is typically stripped from data files before the files are made available to parks or to other parties. Therefore, the administration of surveys conducted under this program is essentially anonymous.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


Typically, no sensitive questions are asked in surveys likely to be conducted under the programmatic approval. However, inclusion of questions of a sensitive nature will be carefully considered in the NPS Social Science Program’s technical and administrative review of individual surveys.


12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.

An hour burden budget of 15,000 hours is requested for each of the three years of the proposed program; this is unchanged from the previous programmatic approval. The most recent Bureau of Labor Statistics (BLS) national wage report (June 2006) lists an average hourly wage of $19.29. Applying the BLS benefits scaling factor of 1.3 yields an average hourly wage with benefits of $25.08. Thus, the estimated annual cost to respondents for the hour burden is $376,155.
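For clarity, the arithmetic behind this estimate is restated below; the annual total reflects the unrounded hourly rate of $25.077 rather than the rounded figure of $25.08.

\[ \$19.29 \times 1.3 = \$25.077 \approx \$25.08 \text{ per hour} \]

\[ 15{,}000 \text{ hours} \times \$25.077 \text{ per hour} \approx \$376{,}155 \text{ per year} \]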


13. Provide an estimate of the total annual [non-hour] cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information [including filing fees paid]. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


There are no costs to respondents beyond the time needed to respond to the surveys conducted under this program.


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.

The costs to the government for surveys likely to be conducted under the proposed program will vary depending on response modes, sample sizes, travel costs, and length of time in the field. In 2007, the average cost of an NPS Visitor Services Project (VSP) mailback survey was $19,200, including operational expenses, labor, and other non-recurring expenses. Multiplying the VSP average by the number of studies conducted in FY 2007 (n = 50), while recognizing that individual survey costs vary considerably, yields an estimated annualized cost to the Federal government of $960,000. However, the proposed program represents a significant savings in operational expenses and labor costs compared with those needed to complete the full PRA approval process. These savings result from the reduced number of Federal Register notices and public comment periods, and from the reduced time needed for review by NPS and DOI staffs. In FY 2006, the programmatic approval produced an estimated cost savings of approximately $1,545 per survey (see Attachment C: FY 2006 Annual Report).
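The annualized cost estimate above follows directly from the FY 2007 VSP average cost and study count cited in this item:

\[ \$19{,}200 \text{ per survey} \times 50 \text{ surveys} = \$960{,}000 \text{ per year} \]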


15. Explain the reasons for any program changes or adjustments reported.


The cost to respondents remains unchanged from the previous program. The hour burden request remains 15,000 hours per year, as previously approved.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

Because data collected under the programmatic approval are intended to address concrete management and planning issues, most analyses of surveys conducted under this program will involve simple tabulations. These include response frequencies, means, standard deviations, confidence intervals, and breakdowns of these statistics by important sub-groups of respondents. When collected, expenditure data may be entered into the NPS Money Generation Model (Version 2) to estimate the economic impact of visitor spending on gateway regions. In some cases, more complex multivariate statistical analyses may be performed, as when estimating coefficients for models based on the Theory of Planned Behavior. In other cases, qualitative studies may involve transcribing interviews or focus group discussions and then conducting content analyses to identify general themes.
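As a minimal illustration of the simple tabulations described above (and not a prescription for any individual study), a 95-percent confidence interval for a sample mean \(\bar{x}\), computed from a sample of size \(n\) with sample standard deviation \(s\), would typically be reported as:

\[ \bar{x} \pm 1.96\,\frac{s}{\sqrt{n}} \]

The specific estimators, critical values, and any finite-population or design-based corrections will depend on the sampling design of each individual survey.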


The results of surveys likely to be conducted under this proposed program will be presented in internal technical reports, at conferences, and in the peer-reviewed literature. Copies of technical reports will be archived with the NPS Social Science Program and entered online into the Social Science Studies Collection, a part of the NPS Focus digital library. Currently, 330 technical reports and other documents are publicly accessible through the Social Science Studies Collection. New reports will be added as they become available.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


We are not seeking such approval.


18. Explain each exception to the certification statement.

There are no exceptions to the certification statement.


1 In 1988, and for a number of years thereafter, the VSP operated with a catalog of specific questions, rather than with a list of pre-approved question categories. At that time, OMB required NPS to use these questions word-for-word, even with typographical, punctuation, and other errors. The significant problems created by this approach resulted in the first NPS Programmatic Approval in 1998 being built around question categories rather than specific questions.

2 In 2007, the NPS contracted with Dr. Don Dillman to review its current VSP questionnaire. Dr. Dillman provided suggestions and feedback that are detailed in Part B of this supporting statement. His comments were largely supportive of the current format and question wording. While other park surveys often differ from the VSP, many incorporate similar questions. The Social Science Program will continue to encourage investigators to use VSP wording for commonly asked questions concerning individual characteristics.


3 The “check-all” list cited in the Dillman report included 16 items.

4 As with all the materials presented in Attachment D, this is an example. The application of the theory will depend on the nature of the study and the concepts being investigated.

5 A consensus has yet to emerge on whether performance measured by quality ratings or performance measured by satisfaction ratings is the best approach to IPA. The Social Science Program will accept either, since both are reported in the peer-reviewed literature.

6 The 1978 General Authorities Act (PL 95-625) requires each park’s General Management Plan to include identification of and implementation commitments for visitor carrying capacities for all areas of a unit.

7 The VERP process is adapted from the “Limits of Acceptable Change” framework developed by the US Forest Service and tailored to the mission of the NPS.
