
Supporting Statement A


Socioeconomic Monitoring (SEM) Study of National Park Service Visitors


OMB Control Number: 1024-NEW

Terms of Clearance: None. This is a new collection.


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.

From its founding over a century ago, the National Park Service (NPS) has been authorized to collect information to enhance its ability to manage, protect, interpret, and research the resources of the System (54 U.S.C. 100701). More recently, with the intent to identify and promote outdoor recreation opportunities, the EXPLORE Act (Public Law 118-234) requires data on outdoor recreation, such as existing user characteristics, activities, and evaluations of opportunities, to better foster and encourage recreation. The NPS currently estimates visitation, but systematic visitor self-reporting methods have yet to be sufficiently implemented.

Social science research in support of park planning and management is mandated by the NPS Management Policies 2006 (Section 8.11.1, “Social Science Studies”). The NPS policy facilitates these studies to protect resources and enhance enjoyment for present and future generations (National Park Service Act of 1916, 39 Stat 535, 16 USC 1, et seq.). This research provides an understanding of park visitors, the non-visiting public, gateway communities and regions, and human interactions with park resources. The agency needs to collect and monitor long-term data on socioeconomic indicators (e.g., demographics, spending, perceptions, trip characteristics) to understand how it is serving the public, inform strategic resource use, and improve visitor experience and non-visitor engagement.

These socioeconomic research needs are emphasized in the NPS strategic goals for science, statements by the NPS leadership, the Second Century Commission report, and the Department of the Interior Strategic Plan for 2018-2022. A 2014 Government Accountability Office report (GAO-15-84 Federal Customer Service) highlighted the need to better understand and monitor customer experience dimensions linked to bureau investments. In response, the NPS initiated its first pilot study for socioeconomic monitoring in 2015-2016 and has since improved and refined the Socioeconomic Monitoring Program (SEM).

To address the need for comprehensive, agency-wide socioeconomic data, the SEM was developed. This program goes beyond park-specific studies to aggregate social science data at a broader level. The SEM aims to provide consistent, comprehensive, agency-wide socioeconomic data to support park management by understanding the demographics and behaviors of park visitors. Each year, 30 park units, stratified by size and type, are randomly selected for sampling to represent all national park visitors, creating a national database and understanding of NPS visitors.

The NPS's current long-term, annual, agency-wide collections of socioeconomic data from visitors are insufficient for comprehensive understanding and comparison across all units. Thus, a new information collection request is proposed for a robust socioeconomic monitoring program. This collection is needed to provide systematic metrics to monitor services across the entire NPS System.

The SEM, conducted by the NPS Social Science Program (SSP), involves annual data collection at 30 parks across the country, aiming to create a representative sample of park visitors. Visitors are surveyed during peak visitation months. To date, national aggregation of these data has not been possible because most park-level studies occur only sporadically.

A strong mandate for socioeconomic monitoring is expressed in NPS strategic goals, leadership statements, and the Second Century Commission report. An external review of the NPS Social Science Program, supported by the 2008 Interior Appropriations Bill Joint Explanatory Statement, also identified this need. In response, an NPS Socioeconomic Monitoring (SEM) working group developed a mission, goals, and objectives emphasizing high-quality social science trend data collection for NPS managers.

In 2008 and 2009, the NPS SSP conducted a systemwide needs assessment for socioeconomic monitoring. Over 400 participants from parks, regions, programs, and the Washington Office identified and rated 175 indicators as useful. These indicators were organized into a hierarchy of categories: visitors and other people in parks, people outside parks, economy, and perceptions, attitudes, and values. These indicators were rated on management, social science, and law and policy significance, resulting in a prioritized list.

Feedback from seven regional scoping workshops highlighted the value of SEM in achieving the NPS mission, the importance of core indicators at various levels, the usefulness of trend data on non-visitors, and concerns about definitions of “traditional” people or cultures. Each region also identified specific monitoring needs. The SEM working group created a logic model to show how selected indicators support the NPS mission by measuring progress toward strategic goals.

From 2014-2017, the SSP developed a pilot SEM program1 at 14 parks to test and refine the approach. The implementation plan used lessons learned to produce a standardized approach for full operation. A list of 30 parks per year was generated using a systematic random sampling framework, with peer-reviewed processes justifying park selection.
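To make the annual draw concrete, the following is a minimal, hypothetical Python sketch of a stratified random selection of 30 parks per year. The field names (park_code, unit_type, size_class), the proportional allocation rule, and the data structure are illustrative assumptions only; they are not the peer-reviewed SEM draw procedure itself.

    # Hypothetical sketch of an annual stratified random park draw.
    # Field names and the proportional allocation rule are illustrative
    # assumptions; the actual peer-reviewed SEM selection process may differ.
    import random

    def draw_annual_sample(parks, n_total=30, seed=2025):
        """parks: list of dicts, e.g. {"park_code": "ABCD", "unit_type": "NP", "size_class": "large"}."""
        rng = random.Random(seed)
        # Group the sampling frame into strata defined by unit type and size class.
        strata = {}
        for park in parks:
            strata.setdefault((park["unit_type"], park["size_class"]), []).append(park)
        total = sum(len(members) for members in strata.values())
        sample = []
        for members in strata.values():
            # Allocate slots roughly proportionally to stratum size, at least one per stratum.
            n_k = max(1, round(n_total * len(members) / total))
            sample.extend(rng.sample(members, min(n_k, len(members))))
        return sample[:n_total]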

In 2021, the SEM was implemented as a phase two pilot program2 at 24 parks across the country. The first-year implementation identified the need for potential revisions in park selection and process. The original SEM draw procedure was retained with minor modifications for adding and removing parks, setting visitation thresholds for new additions, and adjusting sample size targets for small parks.

Building on the lessons learned from the 2014-2017 pilot program and the 2021 phase two implementation, the SEM is crucial for providing comprehensive, agency-wide data on park visitors and non-visitors to enhance management, resource protection, and visitor experience across the National Park System.

Legal and Administrative Justifications:

  • 54 USC 100701 National Park Service and Related Programs: Protection, interpretation, and research in System (formerly The National Park Service Act of 1916)

  • 54 USC 100702 National Park Service and Related Programs; Research Mandate

  • Public Law 118-234 Expanding Public Lands Outdoor Recreation Experiences Act

  • NPS Management Policies 2006 (Section 8.11.1, “Social Science Studies”)

2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

2(a) Need/Authority and Use of Collection

A strong mandate and need for Socioeconomic Monitoring (SEM) exist within the National Park Service (NPS). This mandate is reflected in NPS strategic goals for science, as emphasized by NPS leadership and the Second Century Commission. Additionally, legislation from both current and past administrations has promoted outdoor recreation through investment in infrastructure, programming, public engagement, outreach, and accessibility. Initiatives like America’s Great Outdoors, Every Kid Outdoors, the Great American Outdoors Act, and the Inflation Reduction Act highlight the need for detailed information on visitors to NPS units. SEM also supports environmental justice considerations as outlined in the Justice40 Initiative by enabling the NPS to understand visitor demographics and characteristics over time.

The current lack of understanding and long-term trend data on visitors limits the bureau’s ability to assess how these efforts impact resources and visitor experiences. Customer experience remains a critical aspect of NPS services. SEM will provide a more rigorous means to understand visitor satisfaction and quality, complementing other data collections. It will also inform planning processes required by the National Environmental Policy Act and the National Parks and Recreation Act regarding visitor use and carrying capacity.

Monitoring requires institutional continuity, long-term planning, and predictable resource commitments. In response to this need, an NPS working group developed a mission, goals, and objectives for socioeconomic monitoring in the National Park System. The basic principle is that an SEM program must address the priority social science information needs of park managers.

Understanding the dynamic relationship between people and parks is one of the most challenging tasks facing NPS managers. This understanding is crucial to protect natural and cultural resources, provide public enjoyment opportunities, and maintain the relevance of parks to a diverse America. The NPS Management Policies (2006) state that decision-makers and planners should use the best available scientific and technical information. However, park managers often lack current information on the affected publics, making it difficult to make informed decisions.

The NPS Social Science Program (SSP) conducts and promotes state-of-the-art social science related to the NPS mission, providing usable knowledge to managers and the public. The primary mission of the SEM program is to collect, organize, and make available high-quality social science data, transforming it into usable knowledge.

For monitoring purposes, socioeconomic trend data are organized into four categories relevant to park management:

1. Visitors and people in parks: Numbers, behaviors, and observable characteristics.

2. People outside of parks: Numbers, behaviors, and characteristics.

3. Economy: Park assets and economic impacts on surrounding areas.

4. Perceptions, attitudes, values, and self-reported characteristics: Inside and outside parks.

While most trend information in the first three categories is available from existing sources, monitoring trends in the fourth category requires original data collection through surveys, interviews, focus groups, and other social science methods. The SEM program aims to fill gaps in this category using established survey methods for parks and protected areas. In other words, this collection pertains only to park visitors, not to the non-visiting public.

SEM data will be used by NPS leadership and park managers to understand public experiences in national park units. This data will create visitor profiles, capturing demographics, experiences, spending, and usage patterns across the system. Park managers can compare their site-specific data with regional and national trends, enabling informed, data-driven planning and management decisions. Insights into visitor priorities from SEM data will help align management decisions with these priorities.

Each participating park will receive a specific report based on SEM data, which managers can use to enhance services and facilities. Understanding primary activities and the importance placed on various services will help prioritize recreation programming and infrastructure investments. SEM data will also help the NPS understand visitation differences among various populations, informing strategies to increase access and visitation among underserved groups.

SEM data will contribute to various reports, including the Visitor Spending Effects (VSE) report, allowing comparison of visitor demographics and socioeconomic characteristics over time. Most parks currently rely on outdated VSE data from surveys conducted between 2003-2012; SEM will provide updated and reliable metrics, including length of stay and number of recreation visits.

2(b) The Mandate

Socioeconomic monitoring (SEM) addresses the need of managers, planners, and policymakers for timely and accurate information on people in and around parks. SEM’s purpose is to advance the mission of the NPS by generating and communicating scientifically sound information about trends in how people and parks connect. In this way, SEM helps to achieve several key NPS priorities and strategic goals for science, as articulated by NPS Leadership. These include:

  • Advancing the role of science within the agency and developing a state-of-the-art NPS science program;

  • Elevating and expanding the use of science in decision-making by the NPS and its partners; and

  • Providing the information NPS managers need to measure and promote the continued relevancy and connection of parks with the American people.

Socioeconomic monitoring also contributes to improved dialogues between the NPS and communities. “The impact of tourism to national parks is undeniable: bringing jobs and revenue to communities in every state in the country and making national parks an essential driver to the national economy,” said National Park Service Director Chuck Sams. By providing up-to-date information on the economic contributions of visitor and park payroll spending to gateway regions, SEM documents a tangible value of park preservation to communities and demonstrates the vital role that parks play in achieving environmental, economic, and quality-of-life goals in parks and surrounding areas.

2(c) Need and Guiding Research Questions


Monitoring is the collection and analysis of repeated observations or measurements to evaluate changes in condition and progress toward meeting a management objective. A small number of social science data collection activities already occur in the NPS. Some of these, such as the NPS Comprehensive Survey of the American Public, can be conducted only sporadically as funding becomes available. Others, including the annual Visitor Spending Effects economic impact tool, rely on standard questions proposed within this collection to persist. And others, such as the historic Visitor Services Project survey program, were not designed to meet the Service’s long-term monitoring needs. Taken together, these ad hoc efforts are an impractical approach to monitoring socioeconomic trends. Monitoring requires institutional continuity, long-term planning horizons, stability in methods and samples, and predictable commitments of funds and personnel. This ensures that, over time, monitoring protocols will continue to meet defined standards of quality, adequately characterize uncertainty, and usefully measure change along socioeconomic gradients, even with turnover in staff.

If the vision for socioeconomic monitoring in the NPS is realized, for the first time in its history managers in the NPS will be able to answer such important questions3 as:

RQ1: Do visitor characteristics differ across park unit types (e.g., pre-trip planning, motivations, group types, in-park activities, transportation access mode, length of stay, spending, experience/use history)?

RQ1a: Are differences evident by geography or other biophysical attributes (e.g., state, region)?

RQ2: Demographically, are park visitors representative of the U.S. population? 

RQ2a: Does underservice of U.S. populations show a discernible pattern geographically and/or by park type?

RQ3: Do NPS services, facilities, and recreational opportunities provide a quality experience to the visiting public?

RQ4: Do visitor responses align with the tenets of the NPS Mission Statement, including preservation of natural and cultural resources, pursuit of enjoyment, education, and inspiration, and commitment to both current and future generations?

RQ5: What trends over time can be seen in visitor demographics, trip planning and visitation characteristics, and visitor experiences? Do these trends bring the NPS closer to achieving its mission?

RQ6: How can the NPS continually increase the accuracy of visitor spending effects estimates and visitor use statistics based on empirical testing at the park units selected each year?

Regarding RQ6 above, additional questions focused on economics and economics-related topics provide a more nuanced understanding of the vital and complex role of the NPS in providing a valued park visitor experience and benefiting national, regional, and local economies. Results from some of these economics-focused questions will not initially be released in the anticipated annual park-level and national SEM reports, but they will provide valuable information in other ways (e.g., as economic contribution model inputs). Additional justification for these questions is provided below:

Economics Research Questions (ERQ):

ERQ1: What is the economic value that park visitors derive from recreation opportunities in national parks?  
Justification: Although the NPS provides some of the most sought-after recreation opportunities, little is known about the economic value (i.e., consumer surplus) that park visitors derive from recreating in national parks generally, and from participating in specific recreation activities in parks, such as biking, hiking, and fishing. Such values are increasingly important given new Executive Orders, such as EO 14072, which directs the establishment of the first government-wide natural capital accounts to measure the economic value that natural assets provide to society, as well as new OMB guidance on valuing ecosystem services in benefit-cost analysis. Without this information, the NPS is forced to rely on consumer surplus estimates from studies conducted on other types of public lands, such as national forests, when conducting regulatory impact analyses and other benefit-cost analyses to inform park management. Visitor experiences at other public lands can be quite dissimilar from those at NPS sites. As a result, such studies may not adequately capture the value of recreation at unique national park sites (Heberling and Templeton, 2009; Kaval, 2007). Further, these existing studies are often outdated, failing to capture changes in visitor preferences for national parks over time. This continued reliance on benefit transfer approaches increases the uncertainty in NPS’ benefit-cost analyses. A contingent valuation question asking respondents about their willingness to pay increased trip costs to visit the NPS unit where they are intercepted for the survey would fill this data gap. This question has been tested in the Phase I pilot version of this survey4 and a version of this question has been regularly included in National Wildlife Refuge visitor surveys5 conducted by the U.S. Fish and Wildlife Service.

Additional questions in the survey asking about visitor participation in different recreation activities would allow for the estimation of consumer surplus values for specific recreation activities, if sample sizes are sufficient. This would greatly improve the NPS’ ability to conduct accurate benefit-cost analyses under regulations such as the agency’s Bicycle Rule (36 CFR 4.30). The Bicycle Rule requires a special regulation to authorize bicycle use on new trails outside of developed areas, which requires consumer surplus estimates for bicycling in national parks specifically.

ERQ2: What are best practices for accurately estimating consumer surplus values for recreation in national parks using travel cost models?

Justification: Estimating consumer surplus values using a single-site travel cost model can be accomplished with basic information from a sample of visitors, including the number of trips taken to the recreation site in the past 12 months, the distance traveled to the recreation site, whether visiting the site was the primary purpose of the trip away from home, and income. However, travel cost models and resulting estimates of consumer surplus are highly sensitive to the assumptions made. For example, there is little consensus in the literature on whether to include vehicle depreciation costs in the per-mile cost of travel or not, which can have a large effect on resulting welfare measures (Hang et al., 2016; Duffield et al., 2011). Including a contingent valuation question in the survey allows for a more straightforward estimate of willingness to pay. This estimate would be compared to an estimate based on the travel cost method under different modeling assumptions (i.e., tests of convergent validity) to inform best practices in travel cost models for valuing park visitation.
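As an illustration of the single-site travel cost approach described above, the sketch below shows, under assumed column names and a Poisson trip-demand specification, how a per-trip consumer surplus estimate can be recovered as the negative inverse of the travel cost coefficient. It is a simplified example, not the NPS production model, and the construction of the travel cost variable (e.g., whether depreciation is included) is exactly the modeling choice ERQ2 is meant to inform.

    # Hypothetical single-site travel cost sketch (Poisson trip demand).
    # Column names (trips_12mo, travel_cost, income, primary_purpose) are
    # assumptions for illustration only.
    import pandas as pd
    import statsmodels.api as sm

    def consumer_surplus_per_trip(df: pd.DataFrame) -> float:
        X = sm.add_constant(df[["travel_cost", "income", "primary_purpose"]])
        y = df["trips_12mo"]
        model = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        beta_tc = model.params["travel_cost"]
        # For a semi-log trip demand function, per-trip consumer surplus
        # is commonly computed as -1 / beta_tc.
        return -1.0 / beta_tc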

ERQ3: How does spending by park visitors contribute to nearby local and regional economies?

Justification: Estimates of the effects of NPS visitor spending on local and national economies serve as an indicator of one of the many ways that parks benefit communities and the American public. To document this important benefit, the NPS has been measuring and reporting on the economic contribution of visitor spending for more than 30 years (Koontz et al., 2017). The NPS methods for estimating visitor spending have been replicated and built upon by other Federal agencies including the U.S. Forest Service (White et al., 2013), the U.S. Army Corps of Engineers (Chang et al., 2003), and the U.S. Fish and Wildlife Service (Carver and Caudill, 2013), as well as other countries including Germany (Mayer et al., 2010), Finland (Huhtala, Kajala, and Vatanen, 2010), and Brazil (Souza, 2016). 

Accurate estimation of visitor spending requires high quality visitor survey data representative of the variety of visitor uses and demographics seen across the park system. Many parks currently rely on data from visitor surveys conducted between 2003-2012. The annual supply of new park-level SEM data will provide a reliable way to consistently increase sampling across all park types and geographic regions to improve the accuracy of park-specific trip characteristic and spending data. The new park-level spending profiles developed from SEM visitor surveys will be incorporated in the annual Visitor Spending Effects analysis to update park-level visitor spending and trip characteristic data. 
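As a simplified illustration of how park-level spending profiles could be built from SEM mail-back responses, the sketch below computes per-person, per-day spending by category. The column names and the absence of weighting or visitor segmentation are assumptions for illustration; this is not the Visitor Spending Effects methodology itself.

    # Hypothetical per-person, per-day spending profile from survey records.
    # Column names (spend_*, party_size, days_local) are illustrative only;
    # the actual VSE workflow adds weighting and visitor segmentation.
    import pandas as pd

    SPEND_COLS = ["spend_lodging", "spend_food", "spend_gas", "spend_recreation"]

    def spending_profile(df: pd.DataFrame) -> pd.Series:
        person_days = df["party_size"] * df["days_local"].clip(lower=1)
        per_person_day = df[SPEND_COLS].div(person_days, axis=0)
        # Average across respondents to form a park-level spending profile.
        return per_person_day.mean()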

The answers to these and other questions have immense practical utility for park planning and management, contribute to relationships with gateway regions, respond to frequently asked questions about parks from Congress, stakeholders, and the media, and inform strategic planning and policy at the national level. Moreover, in some cases monitoring may provide useful insights into how well internal and external programs of the NPS are working. In short, results from socioeconomic monitoring are useful in addressing numerous issues in parks, in areas around parks, and in the nation as a whole.

In response to the mandate and need for socioeconomic monitoring, an SEM working group developed the following draft statement of mission, programmatic goals, and operational objectives. A basic principle driving this statement was that the SEM program must respond to the priority social science information needs of park managers.

2(d) Mission and Goals

The mission of the SEM Program is to collect, organize, and make available high-quality social science trend data and to facilitate the transformation of these data into knowledge of value to park managers and other NPS decision-makers. In carrying out this mission, the long-term programmatic goals of the SEM program are to:

  • Develop baseline information describing human populations in parks, regions around parks, and nationally.

  • Conduct long-term monitoring of trends in the socioeconomic characteristics of human populations in parks, regions around parks, and nationally.

  • Establish socioeconomic monitoring as a standard practice throughout the National Park System that transcends traditional program, activity, and funding boundaries.

  • Facilitate the integration of socioeconomic monitoring into NPS planning, management, and decision-making.

  • Share NPS information with other organizations, and form partnerships for attaining common goals and objectives.

2(e) Objectives

The operational objectives of the SEM Program are to:

  • Determine trends in the size, characteristics, and behaviors of human populations in parks, regions around parks, and the nation.

  • Monitor trends in key variables that may explain or predict changes in the number, characteristics, and behaviors of these populations.

  • Monitor trends in factors that may be affected by changes in human populations in and around parks, including trends in visitor experiences.

  • Monitor trends in the quantity and quality of visitor services in parks and gateway communities.

  • Provide reference points on human populations in parks for comparison with regional, state, and national populations.

  • Provide data to meet certain legal and Congressional mandates related to the protection of visitor enjoyment and park resources.

  • Provide a means to measure progress towards achieving performance goals related to visitor enjoyment.

2(f) Brief Methodology Statement

The methodology for the study is a two-phased survey approach that includes (1) an intercept survey administered on-site at the park unit to eligible visitors and (2) a mail-back survey (with an online option) given to visitors who participate in the intercept survey. This methodology and approach have been used in a wide variety of academic research (Jorgenson et al., 2018; Wilton & Nickerson, 2006). Outside of purely academic research, an intercept survey with an accompanying mail-back survey has been used in studies throughout the tourism and parks and protected areas fields. The State of Montana’s research institute, the Institute for Tourism and Recreation Research, has employed this methodology for over 15 years with strong success. The approach captures a large sample of key information from all prospective visitors at the intercept and links complete trip details from the mail-back survey once it is returned. Respondents will be asked to provide contact information (either a mailing address or an email address) so that reminders can be delivered to them. Those who do not choose to provide contact information will still be included in the sample, without reminders.

2(g) Individual Question Justifications

The questions asked in this collection are largely drawn, with some modifications, from the NPS’ Pool of Known Questions. We adopted these questions because they have been demonstrated, across multiple studies, to be reliable. Tables 2.1 and 2.2 include justifications for individual questions along with the research question(s) to which each survey question relates. Intercept and mail-back survey questions will remain consistent across all NPS units, with the exception of NPS unit name customization and of activity, transportation, and similar response options retained or deleted to fit individual park circumstances.

Table 2.1: Intercept Survey Question Justifications

Section/Question

Related RQ/E-RQ

To determine/understand

INTERCEPT SURVEY

Section 1

This information will provide NPS managers with an understanding of visitor profiles related to geographic origin and why visitors choose to travel to the NPS unit. Questions 3-5 will also serve as the non-response bias check.

Q1: Are you a first-time visitor to [NPS site]?

RQ1-2, 5

Prior visits are an important metric in understanding visitor behavior, attitudes, and preferences. This information will help managers build their visitor profile and to determine whether their unit sees more repeat or first-time visitation. This is important to know because first-time visitors differ from repeat visitors in the type of information they require when visiting the park. Additionally, trip characteristics and activity participation may differ between these segments.

Q2: How many visits have you made to other NPS sites over the past 12 months?

RQ1-2, 5

Q3: Are you a permanent or seasonal/second home resident of the local area around [NPS Site]?

RQ1-RQ2


Local residents and visitors from distances further away from the park are likely to differ in their perceptions, as well as trip characteristics. This information will be used to determine the implications of such differences on the visitor spending effects of the NPS unit on the local economy. Sub-questions ask for the ZIP code of respondents who are permanent or seasonal residents. This information is used to assess what areas of the nearby region are represented among local park visitors.

Q4: Do you currently live in the United States?

RQ1-RQ2


This question will be used to assess the geographic origin of visitors from outside the local area to explore differences in their responses. Visitor residency is important for park managers’ understanding of the profile of their visitor base. Further, detailed residency information is necessary to accurately estimate park units’ contributions to the local economy.

Q5: [not a permanent or seasonal resident] Was visiting [NPS Site] the primary purpose for your overall trip away from home?

RQ6, ERQ1-3

Q5 and its sub-question help inform managers about their non-local visitors’ purposes for traveling to the NPS unit and/or the local area. For some non-local visitors, visiting the NPS unit may be part of a larger trip and not the primary purpose. Capturing this information is important to help calculate accurate visitation statistics and travel decisions of park visitors, as well as to understand what role the park plays in bringing visitors to the area.

Q6: [a permanent or seasonal resident] Was visiting [NPS Site] the primary purpose for your trip away from home today?

RQ6, ERQ1-3

Q6 and its sub-question help inform managers about their local visitors’ purposes for visiting the NPS unit. Capturing this information is important to help calculate accurate visitation statistics and to understand what role the park plays for residents of the local area.

Section 2

Questions in this section identify spending and travel patterns in the local region. These questions are important for understanding the overall trip characteristics of park visitors. This information is necessary to accurately calculate the effect visitor spending has on the local economy of the unit.

Q7: How many total days will you spend away from home on your overall trip that includes your visit to [NPS site]?

RQ6, ERQ1-3

This information is important to understand the overall trip characteristics of park visitors and helps estimate and characterize overall visitation to the unit. The research team will work with participating units to identify their local area using staff knowledge and visitor use statistics where available.

Q8: How many days will you spend in the local area on your trip?

RQ6, ERQ1-3

Q9: On your trip, will you stay overnight away from your permanent residence in [NPS Site] and/or within the local area?

RQ6, ERQ1-3


Accommodation usage assists park managers in identifying the most popular overnight accommodation types among park visitors and where those accommodations are located. Furthermore, the number of nights assists in calculating an average length of stay for overnight visitors, which is a component of calculating the contribution of visitor spending to the local region.

Q10: On your trip, what type(s) of accommodations will you use while in the local area, including any nights spent in [NPS Site]?

RQ6, ERQ1-3

Q11: What time did you arrive in [NPS Site] on your first day of your stay?

RQ1, 5-6

Used to assess the time of day the respondent arrived and departed the site – if they spent nights in NPS unit accommodations. This information is used to estimate the length of stay of park visitors. Monitoring visitors’ length of stay at park units assists in estimating overall recreation visits and helps managers gauge how best to provide necessary services based on visitors’ typical stay visiting the unit.

Q12: What time do you plan to leave [NPS Site] on the last day of your stay?

RQ1, 5-6

Section 3

Trip characteristics, including group size, time, number of days, mode of transportation, and prior visits, will be used to build a park-level visitor profile.

Q13: Including yourself, how many people are in your personal group as you visit [NPS Site]?

RQ1-2

Visitor group size and member ages are used to estimate overall visitation to a park and help managers plan for infrastructure needs based on travel party dynamics. The definition of the “personal group” helps differentiate between the travel party and the personal group.

While the NPS often presents age results in ranges, age is usually collected as a discrete number because ages can be converted to ranges but the opposite is not true (similar to the Census approach for collecting age data). Collecting discrete ages makes for easier and more intuitive data organization, since it is difficult to break age ranges into equal categories of 5 to 10 years, especially with the split between children and adults at 18. More nuanced results are also needed to detect gradual changes in the age of visitors, since some published age range categories span 15 or more years. Further, age data may inform programming and requirements that change over time, such as program-specific age limits, retirement age, and driving age.

Q14: What are the ages of each of the adults in your group?

RQ1-2

Q15: How many children in your group were in the following age ranges?

RQ1-2

Q16: Including yourself, how many people in your personal group split the trip expenses?

RQ6, ERQ1-3


Q17: As you know, some of the costs of travel such as gasoline, hotels, rental cars, and airline tickets can and often do increase. If your share of the total trip costs were $X more, would you still have taken this trip to [NPS site]? Please mark (●) one.


RQ6, ERQ1-3

Willingness to pay questions are used to determine the overall valuation respondents place on their visit to an NPS unit. No other question contained in this survey can effectively provide this level of economic information. The information collected in this question will provide the NPS with a more complete understanding of the value park visitors place on their trips. Furthermore, it will provide NPS managers with vital information demonstrating the return on investment in preserving and maintaining various park units. The question format used (dichotomous choice) is a familiar construct that imposes little cognitive burden on respondents.


For additional justification information, please see section below justification table for further explanation.

Q18: On this trip, do you plan to visit [NPS Site] for more than one day?

RQ1, 5-6


Used to estimate NPS unit visitation and the effect of visitor spending. Time spent within the unit and reentry by individual groups is important for calibrating visitation estimates. Because many park units estimate overall visitation by entrance gate visits, it is important to identify how many times, on average, visitors enter and reenter the park unit on a single given trip.

Q19: Did you, or do you plan to leave and re-enter the park today?

RQ1, 5-6

Q20: Which was the most recent entrance you used to enter the park?

RQ1, 5-6

In order to understand visitor flow through the NPS unit and trip characteristics of visitors, it is necessary to gauge the use of different points of entry. Managers can use entrance proportions to explore staffing needs at busy entrances and infrastructure support based on visitor flow by entrance. This question will not be applicable for parks with only one entrance.

Q21: Considering your visit today, have you been to, or do you plan to visit any of the following locations within [NPS Site]?

RQ3-4

This information will help NPS managers determine the most popular locations in the park, which will be used for several purposes to improve visitor experience, including staffing, programming, and infrastructure development/investment. Each unit’s list will be determined using insights from NPS unit staff.

Q22: From the list below, select all forms of transportation you personally used to travel from your home to [NPS Site] on this trip.
RQ1, 5-6

The purpose of these questions is to assist in estimating visitor use and access into the park. Park managers can use this information to plan for infrastructure improvements based on the most common modes of transportation.

Q23: Which form of transportation did you personally use to enter [NPS Site] today?
RQ1, 5-6

Q24: Was your visit to [NPS site] part of a multi-day group tour organized by a travel agency or other tour operator (i.e., a packaged tour)?

RQ1, 5-6


Obtaining the total cost of the package tour and the number of people the package cost covers enables a more accurate understanding of this typically challenging segment of users. In past socioeconomic research, estimating commercial package usage and what is contained in a package tour has been difficult. This information will help the NPS and individual park managers identify how frequently commercial tours enter the park and gain a better estimate of the visitor spending effects included in packaged tour spending.

Q25: On this visit to [NPS Site], which one of the following entrance fees applied to you personally?

RQ6, ERQ4

Used to help understand the predominant types of passes visitors use to access the park. Monitoring pass usage can help managers understand if specific passes are more commonly used than others and whether patterns change over time.

A total expenditures question (Q3) is included on the mail-back survey to capture category-level spending for economic analysis. A total without categories does not provide useful data for analysis.


Table 2.2: Mail-Back/Online Survey Question Justifications

Mail-Back/Online Survey

Section 1

Serves as an introduction to the survey.

Q1: Prior to this trip, how did you obtain information about [NPS Site]?

RQ1, 5

The responses to these questions gauge visitors’ use of different information sources and identify any topics where information either is missing or hard to find. This information is directly relevant to the planning experience of the public and can be used by managers to improve trip planning resources and subsequent, onsite visitor experiences.

Q2: Did you have the information about [NPS Site] you needed on this trip?

RQ1, 3, 5

Section 2

The questions in this section relate to visitor expenditures and are used to calculate the economic contribution visitors make within the park, in surrounding communities within the region, and elsewhere in the U.S. Programs across the agency will use this information to estimate parks’ economic contributions to the local economy.

Q3: Please estimate how much money the group you shared expenses with (or covered expenses for) spent in [NPS site] and [NPS site]’s local area during your trip.

RQ6, ERQ1-3

Questions #3-6 are used to calculate spending per person per day in a variety of spending categories. Question #3 asks the visitor to estimate their total trip spending at the NPS unit and in the surrounding local area, while Question #4 asks the visitor to estimate their total spending on the entire trip. This information will help the NPS understand how visitor spending contributes to the local and regional economies of areas near park units. The NPS can use these data to further refine the visitor spending effects metrics published on an annual basis.


Note: The NPS is analyzing whether we should move the category-level spending question from the mail-back to the intercept survey. Such a shift would constitute a significant change to our visitor use expenditures methodology that we are reluctant to make before we finish a full sensitivity analysis. This analysis should be completed by the end of 2025. We need to consider the tradeoffs between (1) improved sample size vs. (2) asking respondents to predict their full trip spending while still on their trip (although we acknowledge there may also be concerns with asking respondents to recall their full trip spending after the trip is over).

Q4: In total, how much did you and the group you shared expenses with (or covered expenses for) spend on this entire trip, from the time you left home until you returned home?

RQ6, ERQ1-3

Q5: How many people (adults and children), including yourself, were covered by the spending estimates you provided above?

RQ6, ERQ1-3

Q6: If you did not list any expenditures in the local area, please indicate why you did not answer.

RQ6, ERQ1-3

Used to determine the reasons why a participant may have failed to respond to the spending questions above. In addition, the response to this question, in conjunction with other demographic measures, can help understand any potential source of missing data. This question will allow NPS economists to refine spending estimates when respondents fail to record any information.

Section 3

The purpose of this section is to explore visitation trends, motivations to visit, and in-park activities at a park-specific level. Individual unit managers need information on visitor activities in the park to help plan for infrastructure improvements, new activity additions, and trends over time.

Q7: Did you visit any other National Park Service sites on your trip away from home?
RQ1-2, 5

Data from this question will identify visitation patterns and allow managers to collaborate with other units to best serve visitors and identify opportunities for sharing information and resources.

Q8: On this trip, in which of the following activities did you personally participate in while visiting [NPS Site]?

RQ1, 3-5

This information will help NPS managers understand the most popular activities in order to improve the visitor experience and ensure visitors are aware of all available activities. Below is an example list. Each unit’s list will be tailored to its site, using insights from NPS unit staff.

Q9: Of the activities listed in Question 8, which was your primary activity while visiting [NPS Site] during the trip for which you were contacted for this survey?

RQ1, 3-5

Q10: People are motivated to visit a National Park Service site for a variety of reasons. How important to you were each of the following reasons or motivations for visiting [NPS Site] on this trip?

RQ1, 3-5

Managers can use these importance ratings to provide experiences that match up with what the various visitor segments are seeking. Further, current programming and educational opportunities can incorporate these motivations for a more custom experience at the park.

Q11: Of the reasons listed in Q10, which was the most important reason for you to visit [NPS site] on this trip?
RQ1, 3-5

Section 4

At a national level, there has yet to be a complete understanding of program usage, technology preferences, and NPS unit experiences. This section will assist the NPS in identifying whether visitors participate in programs, how they use technology and rate the quality of related services, and what types of information visitors perceive as valuable from their park visit.

Q12: On this trip, in which of the following programs and services did you personally participate within [NPS Site]?

A core user group of the SEM data and associated reports is NPS staff leading the various national-level programs. A better understanding of where programs and services are used, and by whom they are or are not used, provides valuable insight into program planning. At the park level, this information is readily used by interpretation and education staff, as well as park leadership, in determining which programs and services visitors most commonly use. When combined with written-feedback questions, such as the final question, and with questions rating facilities and services, it produces further vital information to inform resource management, needed improvements, and areas of success.

Q13: If you were to visit [NPS Site] in the future, are there specific subjects you would like to learn about?

RQ3-4

NPS units rely on visitor feedback to improve their programs and to understand whether learning objectives are being met through their current curriculum and approach. During pilot phases, responses to this question have been used to inform interpretation and education staff. This survey also replaces visitor survey cards; while most open-ended response questions have been removed, retaining some of the functionality of the visitor survey cards preserves the ability of park staff to assess interpretation and education needs.

Q14: Did you use a personal electronic device while in [NPS Site] to do any of the following actions?

RQ3

These questions will help assess the quality of current services and determine visitor opinions about the importance or need of such services. This is an important topic for the NPS to understand as it progresses further into the next century of management. NPS managers can use this information to understand if new technological connectivity needs to be added to their park.

Q15: How important to you was it during your visit to [NPS Site] to use personal electronic devices to do each of the following, and how would you rate the quality of the service to do each?

RQ3

Q16: Did anyone in your personal group have difficulties accessing or participating in park activities or services, during your visit to [NPS Site]?

RQ1-5

Providing adequate access to people living with disabilities or challenges is a recurring management issue at NPS units. NPS managers can use this data to monitor whether further accessibility measures are needed to meet the needs of the public.

Q17: What did you like most about your visit to [NPS Site]?

RQ3-4, ERQ4

The SEM survey replaces the NPS visitor survey cards that were historically used to obtain customer feedback. NPS units rely on visitor feedback to improve their programs. This information can be used to inform decisions about the site’s management, including identifying areas for improvement or unforeseen issues, as well as highpoints that should be maintained, expanded, or duplicated in other areas.

Q18: What did you like least about your visit to [NPS Site]?

RQ3-4, ERQ4

Q19: To what extent do you agree or disagree with each of the following statements?

RQ3-4, ERQ4

These questions provide park managers with timely data to evaluate their management objectives and identify priority areas for improvement.

Q20: How would you rate the quality of the park facilities, visitor services, and recreational opportunities in [NPS Site]?

RQ3-4, ERQ4

Q21: Overall, how would you rate the quality of the park facilities, visitor services, and recreational opportunities in [NPS Site]?

RQ3-4, ERQ4

Section 5 (demographic questions Q22-29)

The questions in this section are designed to characterize the population of respondents in the sample. This information will be used to understand demographic characteristics of the park user community.

Q22-29:

RQ1-2, 4-5

These are demographic questions; see the Section 5 justification above. Specifically, ZIP code (Q23) is being asked on both the intercept and mail-back surveys as a means to check for errors in matching the two surveys and because of the importance of the question. Also, questions contextualizing household income (Q26-28) are being asked in order to estimate per-person income, which is needed for accurate estimation of travel cost models and associated consumer surplus estimates.

Q30: Is there anything else you would like to tell us about [NPS Site]’s facilities, services, or recreational opportunities?

All, depending on response content

This allows park unit managers to identify additional topics relevant to their visitors that may warrant future attention.


2(h): Willingness-To-Pay Question Additional Information
Question 17 on the intercept survey is a standard WTP question used to value recreation opportunities in national parks and other public lands. NPS consulted with several experts in the field of nonmarket valuation to develop this question, including John Loomis and Chris Neher. This question format has been used and tested in numerous applications (e.g., Duffield et al., 2011; Huber and Sexton, 2019; Keske et al., 2013; Loomis and Keske, 2012; Richardson and Flyr, 2024; Sinclair et al., 2020). The nonmarket good being valued is a recreation trip, and this type of question is often referred to as a “current trip” contingent valuation question (paying increased costs to take the trip, or not taking the trip). A question that values the current trip is a more straightforward formulation than one that values the current trip with a policy/experience change (e.g., many previous studies ask a “current trip” WTP question, followed by a question with a policy/quality change, such as increased water flows or fish catch). This type of question is incentive compatible since there is no reason to strategize when a government policy is not involved. The proposed payment vehicle is an increase in trip costs because trip costs tend to be a more realistic and less controversial payment vehicle compared to park entrance fees (entrance fees may also elicit responses having more to do with a visitor’s perception of a reasonable entry fee rather than measuring true WTP). OMB has approved several of these questions in the USFWS national wildlife refuge visitor surveys6 and Colorado River surveys for whitewater users and anglers.7 The results of this question will be used by the NPS to estimate a per-trip recreation value for a national park (see Sinclair et al., 2020; Richardson and Flyr, 2024 for examples). Per-trip recreation values are increasingly needed by the NPS for regulatory analyses and natural resource damage assessments.

Two sets of bid amounts were developed for this WTP question: one characterized by slightly lower dollar amounts ranging from $5 to $750 to be used for parks that have primarily day use, and another with slightly higher dollar amounts ranging from $5 to $2,000 to be used for parks with overnight stays. The full list of bid amounts for each version (A and B) is included in the grayed annotation box preceding Question 17 on the intercept survey. These bid amounts were developed based on reviews of the existing recreation valuation literature, specifically CV questions with a similar question format, as well as input from experts in the field of nonmarket valuation. The bid amounts were then tested in three pilot surveys.8 The results of these pilot surveys indicate sensitivity to the bid amounts presented, and issues such as fat tails were not found to be problematic (e.g., only 5-8% of respondents chose the highest bid amount, indicating that the right tail of the distribution was captured relatively well). Published journal articles from this pilot work include Richardson and Flyr (2024) and Sinclair et al. (2020).
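For context only, the sketch below shows one common way dichotomous-choice responses like these could be analyzed: a linear-in-bid logit in which mean willingness to pay is estimated as the negative ratio of the intercept to the bid coefficient. The column names are assumptions, and this is a generic illustration of the technique rather than the NPS analysis plan.

    # Hypothetical dichotomous-choice contingent valuation analysis.
    # Columns (accepted_bid coded 0/1, bid_amount in dollars) are assumptions.
    import pandas as pd
    import statsmodels.api as sm

    def mean_wtp(df: pd.DataFrame) -> float:
        X = sm.add_constant(df[["bid_amount"]])
        y = df["accepted_bid"]
        logit = sm.Logit(y, X).fit(disp=False)
        alpha, beta = logit.params["const"], logit.params["bid_amount"]
        # For a linear-in-bid logit, mean (and median) WTP is -alpha / beta.
        return -alpha / beta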

3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden and specifically how this collection meets GPEA requirements.

As highlighted in Question 2, data from park visitors are collected first through an intercept survey administered on a tablet with visitors at the park. Visitors are then handed a mail-back survey with the option of completing it online or on paper and mailing it back. Using a tablet and online software allows for a higher degree of accuracy in recording responses. The intercept survey will be administered using an electronic tablet that does not require a Wi-Fi or cellular connection. Surveyors will use the tablet to ask visitors questions on-site and record their responses through survey software. The survey software platform, Qualtrics, will save responses automatically, and responses will be uploaded at the end of each study day.

For the mail-back portion of the survey, we anticipate 20% of respondents will utilize the electronic option and 80% will fill out the paper survey. The online version of the mail-back survey will also be administered using Qualtrics. Visitors will be given a customized URL, based on the NPS unit’s official acronym, to access the online version of the mail-back survey. The directions to participate are detailed in the cover letter given to respondents after they complete the intercept survey on-site. Respondents will be asked to use the 4-digit ID code on their survey packet, which will be linked back to their on-site responses. The online survey is Section 508 compliant. This methodology strives to reduce the burden on the respondent by capturing some key data during the intercept process and allows flexibility for responding to the mail-back survey by providing two options to participate.
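To illustrate how the 4-digit ID code supports linkage of the two instruments, the sketch below joins mail-back responses to intercept records. The file layout and field names (survey_id, park_acronym) are assumptions for illustration, not the actual Qualtrics export schema.

    # Hypothetical linkage of mail-back responses to intercept records
    # using the 4-digit ID code on each survey packet. Field names and
    # file formats are illustrative assumptions only.
    import pandas as pd

    def link_surveys(intercept_csv: str, mailback_csv: str) -> pd.DataFrame:
        intercept = pd.read_csv(intercept_csv, dtype={"survey_id": str})
        mailback = pd.read_csv(mailback_csv, dtype={"survey_id": str})
        # Keep every intercept record; mail-back fields are left missing
        # for visitors who did not return the second survey.
        return intercept.merge(
            mailback,
            on=["park_acronym", "survey_id"],
            how="left",
            suffixes=("_intercept", "_mailback"),
        )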

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

There is no known duplication of efforts. While individual park service units occasionally conduct socioeconomic surveys, the currently available data are insufficient for generalizing findings across all NPS units in the System. Further, individual parks may conduct separate, specialized visitor research; however, such studies would not be compatible for comparison to the SEM results. A review of previous socioeconomic studies reveals slight to moderate variation across survey efforts in how the surveys were conducted, how the questions were asked, and how the data were treated in reporting. Thus, previous or ongoing socioeconomic research studies do not allow for comparisons across units and cannot be aggregated to represent the System as a whole, making SEM necessary Service-wide.

5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.

This collection will have no impact on small businesses or other small entities. The survey will only target members of the public who are visiting national parks.

6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

NPS policy mandates that social science research be used to provide an understanding of park visitors, the non-visiting public, and gateway communities and regions. Without SEM, the NPS would be unable to meet this mandate, and monitoring and comparing socioeconomic trends across the agency would remain impossible. Further, without rigorous, generalizable visitation data, the NPS would be unable to track trends in visitor demographics and would lack the basic understanding of visitors needed to ensure national parks remain relevant to taxpayers. Additionally, there is no other feasible study plan that would allow for data collection at the scale and rigor required to make generalizations about park visitors as a whole.

Typically, only larger park units have been able to afford individual survey efforts. The SEM includes parks of all sizes and types, including many parks that could not afford to conduct their own study. Consequently, without this information, these smaller, less financially capable parks would continue to rely on outdated data or anecdotal information and assumptions, which would negatively affect their ability to make well-informed planning and management decisions. These reasons, along with the highlighted agency-wide needs, are why the SEM study is a critical part of the future of visitor management in the NPS.

7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

    • requiring respondents to report information to the agency more often than quarterly;

    • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

    • requiring respondents to submit more than an original and two copies of any document;

    • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

    • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

    • requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

    • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

    • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.

There are no special circumstances as part of this collection.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and in response to the PRA statement associated with the collection over the past three years, and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every three years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


On August 27, 2021, a 60-Day Federal Register Notice (86 FR 48244) was published announcing this information collection request. Public comments were solicited for 60 days, ending on October 26, 2021. No comments were received.

A 30-day FRN (88 FR 44831) was previously published on 7/13/2023. A subsequent 30-day FRN will be published as a result of discussions with the OIRA desk officers about (1) the overall intent of the collections, (2) methodology and statistical sampling, and (3) the length of survey instruments. The following updates have been made to the collection: (1) reductions to the overall number of questions, (2) added methods for increasing response rates, and (3) slight adjustments to the wording of questions for clarity.

The survey questions were tested through the 2015-2016 Phase I Pilot Study [9]. The survey was tested in 14 parks across the System, refined based on findings, and reviewed with multiple subject matter experts. Questions were added or altered based on these refinements and then reviewed by three subject matter experts (Table 8.1). The study plan and associated approach to collecting data, however, proved accurate and reliable. The refined questions and methods were then applied across 24 parks in 2022 during the Phase II Pilot Study (Socioeconomic Pilot Survey, Phase II; OMB Control #1024-0224; exp. 5/31/2023). The two pilot phases have allowed the NPS to (1) validate the survey questions, (2) investigate various sampling methods, (3) estimate the respondent burden and response rates, and (4) determine the usability of the survey design. These pilot studies have been successful and have helped address potential non-response bias and the overall representativeness of the data at an aggregate level.

To inform Phase II of the Pilot Study, the three subject matter experts in outdoor recreation and tourism research (listed in Table 8.1) were provided with methodology, results, and question performance analysis from the original pilot and the proposed enhancements to the intercept survey, mail-back/online survey, and overall study. We asked them to provide feedback on the overall methodology and data elements, survey design, validity of question content, clarity of instructions, and possible analytical needs. Comments were provided via track changes with additional consultation via telephone.

Additionally, the final proposed survey instruments (both intercept and mail back) were tested on nine members of the public of various ages (18 or older) and backgrounds to inform burden estimates and overall clarity of final questions and instructions. All comments and feedback were considered and addressed throughout the refinement of the Phase II Pilot Study.

Table 8.1. Peer Reviewers

    Name                                     Affiliation
    1: Zach Miller (Assistant Professor)     Utah State University
    2: Bynum Boley (Associate Professor)     University of Georgia
    3: Will Rice (Assistant Professor)       University of Montana


“Whether or not the collection of information is necessary, including whether or not the information will have practical utility; whether there are any questions they felt were unnecessary.”

Overall Comments: Reviewers were generally positive in terms of the utility of and need for the SEM, as well as the overall high quality of the study methodology and data elements, survey design, validity of question content, and clarity of instructions.


Comment #2: While acknowledging the utility of the payment card question (# 7), a reviewer commented on the possibility of having an open-ended response as opposed to discrete response options.


Pilot Testing: Results from the pilot testing revealed that the employed sampling methods are transferable between NPS units and that the survey instruments provide the intended information, including socioeconomic trends and visitor attitudes, behaviors, and preferences. Furthermore, the results were found to be broadly generalizable.


NPS Response: The NPS appreciated the overall positive feedback. We responded to the feedback regarding question 7 with an explanation of the need to maintain response options in order to satisfy the economic standards of the question and ensure question validity.


“What is your estimate of the amount of time it takes to complete each form in order to verify the accuracy of our estimate of the burden for this collection of information?”

Pre-Testing: Phase II Pilot Testing and the more recent pre-test with nine individuals have provided the NPS with a detailed and accurate burden estimate for all portions of this submission, including the intercept survey, non-response bias questions, and the mail-back/online survey. The estimated average burden for the intercept survey is 6 minutes. The estimated average burden for the mail-back survey is 13 minutes.


“Do you have any suggestions for us on ways to enhance the quality, utility, and clarity of the information to be collected?”

Comment #1: Provided feedback on the response options for question 11 on the mail-back survey regarding motivations for individuals visiting NPS units.


Comment #2: Asserted the need to maintain directionality for Likert Scales.


NPS Response: Based on feedback, we reviewed and refined the response options for question 11 on the mail-back survey and ensured directionality for Likert Scales throughout the survey instruments.


“Ways to minimize the burden of the collection of information on respondents”

Comment #1: Identified a need to ensure standardized training of surveyors across all units and all years, as well as continued refinement of intercept methods to maintain burden estimates.

NPS Response: The NPS has ensured standardized, comprehensive, and consistent training for fielding SEM across all units now and into the future. Based on experience during both phases of pilot testing, the NPS has refined intercept methods to minimize burden, including providing high-quality surveyor training and refining the skip logic and survey flow for the intercept survey to allow for quick and effective delivery.

9. Explain any decisions to provide any payments or gifts to respondents, other than remuneration of contractors or grantees.

The Social Science Program (SSP), like many survey research efforts across federal land management agencies, has observed lower-than-expected response rates over the past few years of visitor surveys, and response rates continue to decline. Therefore, the SSP finds it in the best interest of the SEM (and the wider NPS that uses data obtained through the SEM) to test the application of a de minimis incentive in a subset of the parks selected for this year.

The incentive will be small in size and value and, as required by survey ethics, will be offered to those approached whether or not they complete the survey. Options include an NPS sticker, button, or similar item. The incentive will be piloted at three parks in the coming round of SEM surveys. In each of these parks, to maintain a scientific approach and control variables, only a portion of the visitors approached (based on day of survey) will receive incentives. The vendor will then compare the response rates of the incentive and non-incentive subgroups. The results of this pilot will inform potential future use of incentives or the pursuit of other options to improve response rates.
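The collection does not prescribe the statistical test for this comparison; one reasonable option is a chi-square test of independence on completion outcomes for incentive versus non-incentive survey days. The Python sketch below is illustrative only, with placeholder counts (not study data):

    # Hypothetical comparison of response rates on incentive vs. non-incentive survey days.
    # The counts below are placeholders, not study data.
    from scipy.stats import chi2_contingency

    #                    completed, refused
    incentive_days    = [430, 70]    # e.g., 86% response on days when the incentive is offered
    no_incentive_days = [400, 100]   # e.g., 80% response on days without the incentive

    chi2, p, dof, expected = chi2_contingency([incentive_days, no_incentive_days])
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")   # a small p-value would suggest the incentive helps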

10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

Anonymity/confidentiality of responses will be described to respondents in the initial on-site intercept contact and reiterated in written literature accompanying the paper and online versions of the mail-back survey. The evaluation and statistical analysis of collected information will be conducted independently of personally identifiable information, and respondents’ names will never be connected to their reported responses. Personally identifiable information will only be accessible to the study team, except as required by law.

The only personally identifiable information collected from visitors will be home mailing addresses and/or email addresses, for the sole purpose of administering reminders for the mail-back survey. This information will be accessible to and used by the study team only for the purposes described in this study. Each mail-back survey will be assigned a unique identifier tying it to the respondent’s intercept survey. This identifier safeguards the respondent’s anonymity and allows researchers both to join data upon mail-back survey completion and to monitor non-response bias among respondents who do not return the mail-back survey. The unique identifier will be used in all databases; only one password-protected list will link the unique identifier to respondents’ mailing and/or email addresses.

This process will be described to respondents by way of the informed consent page during the on-site survey. At the end of the data analysis period, all personally identifiable information will be destroyed.
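For illustration of the linkage design described above, a minimal sketch (hypothetical identifiers and fields, not the actual SEM data model) showing how analysis records can be joined on the unique identifier while contact information remains in a single, separately protected list:

    # Hypothetical illustration of the linkage design described above: analysis files carry
    # only the unique identifier, while contact information sits in one protected list.
    import pandas as pd

    intercept = pd.DataFrame({"survey_id": ["A001", "A002"], "party_size": [4, 2]})
    mailback  = pd.DataFrame({"survey_id": ["A001"], "overall_rating": [5]})

    # Analysts join intercept and mail-back records on the identifier alone.
    analysis = intercept.merge(mailback, on="survey_id", how="left")

    # The contact list is stored separately (password protected) and is used only to send
    # reminders to intercept respondents who have not yet returned the mail-back survey.
    contacts = pd.DataFrame({"survey_id": ["A001", "A002"],
                             "email": ["visitor1@example.com", "visitor2@example.com"]})
    needs_reminder = contacts[~contacts["survey_id"].isin(mailback["survey_id"])]
    print(analysis)
    print(needs_reminder)   # A002 would receive a reminder; addresses never enter the analysis file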

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

No questions of a sensitive nature will be asked as part of this collection.

12. Provide estimates of the hour burden of the collection of information. The statement should:

  • Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices. * If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

  • Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under item 13.

Based on the Phase I Pilot Study, further refinements to the methodology, outreach to subject matter experts, and the Phase II pilot results, we expect to receive a total of 58,544 annual responses. This includes the intercept survey (n=36,936), non-response survey (n=8,310), and mail-back survey (n=13,298). The total annual burden for this collection is estimated to be 6,852 hours (Table 12.1). A worked check of this arithmetic follows the component descriptions below.

1. Intercept Survey: For the intercept survey, we will randomly approach approximately 46,170 visitors while on-site at 30 parks per year. Based on Phase II piloting using these exact methods, we estimate that 80% (n=36,936 total) of those approached will participate in the 6-minute intercept survey (including the initial contact time and survey completion time). Thus, the annual burden for the on-site survey is estimated to be 3,694 hours.

2. Non-Response Check & Observations: Two potential sources of non-response bias exist in this study: (1) those who choose to participate in the intercept survey compared to those who refuse, and (2) those who participate in the intercept survey and complete the mail-back survey compared to those who participate in the intercept survey but do not return the mail-back survey. For each park’s sample, the research team will monitor response rates for both scenarios to gauge whether non-response bias needs to be tested. If the response rate of either the intercept survey or the mail-back survey falls below 80%, a non-response bias test will be conducted on the appropriate group. For instance, if less than 80% of visitors accept the intercept survey (i.e., choose to participate), we will compare the responses of those who participated in the intercept survey with those who answered the four non-response bias questions below. Responses and observations will be recorded and compared to final respondent data to investigate any possible non-response bias.

The four non-response bias questions asked of respondents are below:

1. “Are you a permanent or seasonal/second home resident of the local area around [NPS Site]?” 

2. “Do you currently live in the United States?” 

3. “On this trip away from home, have you [and your personal group] stayed, or will you stay overnight away from your permanent residence either in [NPS Site] and/or within the local area? [Show map]”

4. “Was visiting [NPS Site] the primary purpose for your overall trip away from home?”

Based on Phase II pilot testing, we anticipate that of all the visitors contacted, 9,234 will be non-respondents. We expect that 90% (n=8,310) of non-respondents will agree to answer the non-response bias questions and 10% (n=924) will outright refuse to participate. We expect each non-response bias contact to take 2 minutes (rounded up) to complete, inclusive of the introduction to the survey purpose and invitation to participate. Thus, the total annual non-response bias burden will be 277 hours.

3. Hard Refusals: Of the 9,234 visitors contacted who refuse to respond to the on-site survey, we expect 10% (n=924) to be hard refusals. The burden for hard refusals (visitors completely refusing to participate in the collection) will not be estimated due to the de minimis nature of their participation.

4. Mail-back/Online Visitor Survey: Of the 36,936 visitors who complete the intercept survey, we expect 90% to accept the mail-back survey (n=33,243). Based on the Phase I and II pilot studies, we anticipate a 40% response rate across all park units (n=13,298) for those completing the mail-back/online survey. We estimate the mail-back survey will take 13 minutes to complete, resulting in a total annual burden of 2,881 hours.
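The totals above can be reproduced from the stated contact counts, participation rates, and per-response times. The short Python sketch below is an illustrative check only; variable names are ours, and small differences reflect rounding in the narrative:

    # Illustrative check of the annual burden arithmetic summarized in Table 12.1.
    # Counts and rates come from items 1-4 above; variable names are ours.

    approached = 46_170                       # visitors approached on-site at 30 parks per year
    intercept_n = round(approached * 0.80)    # 80% participate -> 36,936 completed intercept surveys

    non_respondents = approached - intercept_n     # 9,234 decline the intercept survey
    nr_check_n = 8_310                             # ~90% of decliners answer the four bias questions
    hard_refusals = non_respondents - nr_check_n   # 924 hard refusals (burden not estimated)

    mailback_n = 13_298                       # ~40% of the ~33,243 who accept the mail-back survey

    def burden_hours(responses, minutes_each):
        """Completed responses x minutes per response, rounded to whole hours."""
        return round(responses * minutes_each / 60)

    rows = {
        "Intercept survey (6 min)":         burden_hours(intercept_n, 6),   # 3,694
        "Mail-back/online survey (13 min)": burden_hours(mailback_n, 13),   # 2,881
        "Non-response bias check (2 min)":  burden_hours(nr_check_n, 2),    # 277
    }
    print(intercept_n + mailback_n + nr_check_n, sum(rows.values()))        # 58,544 responses, 6,852 hours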

Table 12.1. Estimated annual respondent burden

    Activity                                               No. of Completed    Completion Time    Annual Burden*
                                                           Responses           (minutes)          (hours)
    Intercept Survey (includes initial contact)            36,936              6                  3,694
    Mail-back/Online survey                                13,298              13                 2,881
    Non-response bias check (includes initial contact)     8,310               2                  277
    Totals                                                 58,544                                 6,852

* The calculations in this table are rounded to the nearest whole number.

We estimate the total dollar value of the annual burden hours to be $316,151 (Table 12.2). The estimated dollar value of the burden hours for this collection reflects the nature of our respondents, which are individuals or households. This estimate includes the multiplier for benefits and is based on the Bureau of Labor Statistics National Compensation Survey: Occupational Wages in the United States and BLS news release USDL-22-1982 [10], Employer Costs for Employee Compensation—March 2024, released June 18, 2024. The value used was $46.14/hour for individuals/households. A worked cross-check follows Table 12.2.

Table 12.2. Estimated Dollar Value of Respondent Annual Burden Hours

    Activity                     Estimated # of    Burden     Value of Burden Hours    Annual Value of
                                 Respondents       (hours)    (including benefits)     Burden Hours*
    On-site visitor survey       36,936            3,694      $46.14                   $170,441
    Non-response bias check      8,310             277        $46.14                   $12,781
    Mail-back/Online survey      13,298            2,881      $46.14                   $132,929
    Total                        58,544            6,852                               $316,151

* The calculations in this table are rounded to the nearest whole dollar amount.
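As a cross-check of Table 12.2, a minimal sketch assuming the same $46.14/hour fully loaded rate described above:

    # Illustrative cross-check of Table 12.2 at $46.14/hour (wages plus benefits).
    RATE = 46.14
    burden_hours = {"On-site visitor survey": 3_694,
                    "Non-response bias check": 277,
                    "Mail-back/Online survey": 2_881}
    values = {activity: round(hours * RATE) for activity, hours in burden_hours.items()}
    print(values, sum(values.values()))   # 170,441 / 12,781 / 132,929; total 316,151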

13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information.

There is no non-hour cost burden associated with this collection of information.

14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.

The total annual cost to the Federal Government is $2,543,686. This includes the cost to the Federal Government for salaries and benefits for administering this information collection (Table 14.1) and operational expenses (Table 14.2).

We used the Office of Personnel Management Salary Table 2024-DEN [11] to determine the hourly rate. We multiplied the hourly rate by 1.59 to account for benefits, in accordance with the Bureau of Labor Statistics news release referenced above. A worked check of the totals in Tables 14.1 and 14.2 follows Table 14.2.

Table 14.1: Annualized Federal Employee Salaries and Benefits

    Position           Grade/Step    Hourly Rate    Hourly Rate incl. benefits     Estimated time      Annual Cost
                                                    (1.59 x hourly pay rate)       per task (hours)
    Project Manager    12/5          $52.50         $83.48                         80                  $6,678
    Project Advisor    12/5          $52.50         $83.48                         40                  $3,339
    Project Advisor    12/5          $52.50         $83.48                         20                  $1,670
    Totals                                                                         140                 $11,687


Table 14.2: Operational Expenses

    Operational Expenses                                                                   Estimated Cost

    Task A: Overall Project Management and Coordination; Scheduling; Monthly Reporting     $75,658

    Task 1.0: Study Design and Methods Development                                         $114,980
      • Organize and schedule project scoping calls
      • Coordination, testing, and refinements for Year 2; new full ICR package for Year 3
      • Develop data collection and management plan (overall, for all parks)

    Task 2.0: Survey Administration, Data Collection, and Data Entry                       $1,071,383
      • Park-specific logistics and collection packages with updated survey instruments for 24 parks; three meetings per park
      • Survey administration in the field and after-field memorandums
      • Labor costs for field teams
      • Cleaned and certified data preparation

    Task 3.0: Analysis and Reporting (reports for all parks and a national report)         $367,752

    Task 4.0: Communication Materials                                                      $19,235

    ODC: Direct Costs and Expenses                                                         $882,991

    Total Operational Expenses                                                             $2,531,999
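The total Federal cost can be cross-checked from the figures above; a minimal sketch, assuming the loaded hourly rate of $83.48 from Table 14.1 and the task costs from Table 14.2:

    # Illustrative check of the annualized Federal cost (Tables 14.1 and 14.2).
    LOADED_RATE = 83.48   # $52.50/hour (GS-12/5, OPM 2024-DEN table) x 1.59 benefits multiplier

    staff_hours = {"Project Manager": 80, "Project Advisor (1)": 40, "Project Advisor (2)": 20}
    salaries = round(sum(hours * LOADED_RATE for hours in staff_hours.values()))   # 11,687

    operational = sum([75_658, 114_980, 1_071_383, 367_752, 19_235, 882_991])      # 2,531,999

    print(salaries + operational)   # 2,543,686 total annual cost to the Federal Government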


15. Explain the reasons for any program changes or adjustments reported.

This is a new collection.

16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

Data analysis and reporting will be developed and conducted at the individual park level for each of the 30 units per year. These reports will be published in accordance with the Natural Resource Report Series template and publishing standards and will include descriptive statistics, frequencies, percentages, and averages for appropriate questions. Questions regarding spending will not be included in these reports; rather, they will be analyzed independently by NPS economists for inclusion in the annual NPS Visitor Spending Effects (VSE) report [12]. Analyses of variance (ANOVAs), t-tests, and chi-square tests will be used to test for differences between groups and for non-response checks, as appropriate and as interest exists at a surveyed park or other scales of management. For example, some parks may be interested in differences between first-time and repeat visitors, where appropriate inferential statistics can be of use.
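For illustration only, a minimal sketch of one such group comparison of the kind described above; the ratings are placeholders, not study data:

    # Hypothetical example of one such comparison: first-time vs. repeat visitors on a
    # continuous survey measure, using Welch's t-test. Scores are placeholders, not study data.
    from scipy.stats import ttest_ind

    first_time = [5, 4, 5, 3, 4, 5, 4]   # e.g., trip-satisfaction ratings
    repeat     = [4, 3, 4, 4, 3, 5, 3]

    result = ttest_ind(first_time, repeat, equal_var=False)
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
    # Chi-square tests and ANOVAs would be applied analogously for categorical
    # questions or for comparisons across more than two groups.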

This project is slated to occur every year, beginning with data collection in March 2024, and will continue for a minimum of three years. Each year, the principal investigators will prepare a final report for the NPS summarizing regional and national results, as well as individual reports for each participating park unit. Because many reports will be produced, we expect reporting and analysis to be ongoing while preparing for the next season of sampling. We anticipate that individual park reports will be finalized by March of the following year and that regional and national reports will be finalized by July of the following year.

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

We will display the OMB control number and expiration date on the information collection instruments.

18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”

There are no exceptions to the certification statement.












1 NPS Socioeconomic Monitoring Program (SEM 1.0); OMB Control #: 1024-0224; Exp. 5/31/2020

2 Socioeconomic Pilot Survey, Phase II; OMB Control #1024-0224; ex. 5/31/2023

3 NPS Socioeconomic Monitoring Program (SEM 1.0); OMB Control #: 1024-0224; Exp. 5/31/2020

Socioeconomic Pilot Survey, Phase II; OMB Control #1024-0224; ex. 5/31/2023

4 OMB Control Number 1024-0224, Expiration Date: 5-30-2019, Programmatic Clearance for NPS-Sponsored Public Surveys

5 OMB Control Number 0596-0236, Expiration Date: 3/31/2024, Interagency Generic Clearance for Federal Land Management Agencies Collaborative Visitor Feedback Surveys on Recreation and Transportation Related Programs and Systems

6 OMB Control Number 0596-0236, Expiration Date: 3/31/2024, Interagency Generic Clearance for Federal Land Management Agencies Collaborative Visitor Feedback Surveys on Recreation and Transportation Related Programs and Systems

7 OMB Control Number 1024-0272, Expiration Date: 8/31/2018, Survey of Direct Recreational Uses Along The Colorado River (see Angler survey Q17; Whitewater survey Q28).

8 OMB Control Number 1024-0224, Expiration Date: 5-30-2019, Programmatic Clearance for NPS-Sponsored Public Surveys

9 NPS Socioeconomic Monitoring Program (SEM 1.0); OMB Control #: 1024-0224; Exp. 5/31/2020

10 https://www.bls.gov/news.release/pdf/ecec.pdf

11 https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2024/DEN_h.pdf

12 https://www.nps.gov/nature/customcf/NPS_Data_Visualization/docs/NPS_2021_Visitor_Spending_Effects.pdf


