

Supporting Statement A

Programmatic Clearance

for NPS-Sponsored Public Surveys

OMB Control Number 1024-0224


Terms of clearance: NONE


General Instructions


A completed Supporting Statement A must accompany each request for approval of a collection of information. The Supporting Statement must be prepared in the format described below, and must contain the information specified below. If an item is not applicable, provide a brief explanation. When the question “Does this ICR contain surveys, censuses, or employ statistical methods?” is checked "Yes," then a Supporting Statement B must be completed. OMB reserves the right to require the submission of additional information with respect to any request for approval.


Specific Instructions


Introduction:


The National Park Service (NPS) is requesting a three-year extension of its Programmatic Clearance for NPS-Sponsored Public Surveys, originally approved by the Office of Management and Budget (OMB) in August 1998. The Programmatic Clearance enables NPS to conduct research through standard social science research methods (e.g., questionnaires, focus groups, and interviews). This information has historically been collected to inform and improve the services and products that NPS provides to the public and thus to better carry out part of its statutory mission.


As part of its Social Science Program, the NPS sponsors surveys of the public to provide park managers with information for improving the quality and utility of NPS programs. Many NPS surveys are similar in terms of the populations being surveyed, the types of questions being asked, and the research methods used. The existing Programmatic Clearance presents an alternative approach to complying with the Paperwork Reduction Act of 1995 (PRA). Under the proposed renewal, NPS will continue to pursue an accurate understanding of the relationship between people and parks, which is critical to achieving the mission of the National Park System: to protect resources unimpaired and to provide for public enjoyment, education, and inspiration. Such understanding requires a sound scientific basis; hence, social science research is a necessary and important function of the agency.


The NPS Social Science Program will continue to conduct the necessary quality control, including assuring that each survey instrument comports with the guidelines of the Programmatic Clearance, and will continue to submit each information collection request to OMB for expedited review when a specific information collection is ready to be deployed.


Each collection under this generic clearance must be well defined in terms of its sample or respondent pool and research methodology, and each individual collection should clearly fit within the overall plan and scope of the currently approved Information Collection Request. Individual collections should not raise any substantive or policy issues or go beyond the methods specified in this generic ICR. Any individual collection that would require policy or methodological review is inappropriate for expedited review under this generic clearance and must go through the full PRA process. For example, a generic clearance is not appropriate for the collection of influential or policy-related information and is probably not appropriate for large collections involving many respondents and high respondent burden.


Examples of collections that would not generally fall under the Programmatic Clearance Process are: (a) surveys that will be used for making significant policy or resource allocation decisions; (b) collections whose results are intended to be generalizable to the population of study; (c) collections that impose significant burden on respondents or significant costs on the Government; (d) collections that are on potentially controversial topics or that raise issues of significant concern to other agencies; (e) collections that are intended for the purpose of basic research and that do not directly benefit the agency’s customer service delivery; and (f) collections that will be used for program evaluation and performance measurement purposes.


Who Will Be Impacted by This Renewal?


General Public: Anyone (in-person or virtual) who uses NPS resources, products, or services. This includes individual households, representatives of the private sector, Tribes, academia, and other government agencies. Depending upon their role in specific situations and interactions, citizens and NPS stakeholders and partners may also be considered customers.


Stakeholder: Any groups, individuals, or Tribes who have an expressed interest in and who seek to influence the present and future state of NPS resources, products, and services.


Partner: Those groups, individuals, Tribes, and government agencies that are formally engaged in helping NPS accomplish its mission, or with whom NPS has a joint responsibility or mission.


Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.


The National Park Service (NPS) is requesting an extension/renewal of this Programmatic Clearance for NPS-Sponsored Public Surveys (OMB Control Number 1024-0224). The NPS is required by the National Park Service Act of 1916 (39 Stat. 535; 16 U.S.C. 1 et seq.) to preserve national parks for the use and enjoyment of present and future generations. At the park level, this means resource preservation, public education, facility maintenance and operation, and physical development roughly in proportion to the seasonally adjusted volume of use, taking into consideration visitor characteristics and activities when determining park carrying capacity. Other federal rules (the National Environmental Policy Act of 1969 and NPS Management Policies, 2006) require input from the public when assessing the impact of development on users, potential users, and residents near parks as part of each park’s General Management Plan. These laws, policies, and regulations dictate periodic surveys of national park visitors, potential visitors, and residents of communities near parks.


The general scope of the Programmatic Clearance will remain unchanged and will continue to include individual surveys of park visitors (current, past, potential, and now virtual), residents of communities near parks, and, in some cases, individuals living elsewhere in the United States. The use of the Programmatic Clearance will continue to be limited to non-controversial surveys of park visitors, potential park visitors, and/or residents of communities near parks that are not likely to include topics of significant interest in the review process.


Under the Programmatic Clearance, an alternative set of practices and procedures is employed by which OMB determines whether or not to approve proposed surveys of park visitors, potential park visitors, and/or residents of communities near parks. All questions asked of ten or more members of the public must fall within the scope of 12 topic areas and be drawn from the Pool of Known Questions.


OMB reviews NPS procedures for these surveys as a program of study for the purpose of overall clearance. OMB also reviews each individual survey instrument certified and submitted by NPS as part of the program. NPS, with DOI and OMB monitoring, conducts the necessary quality control through peer review of appropriate program elements. NPS also maintains an information base of public surveys conducted in parks to be used to increase the efficiency of future surveys. All approved survey instruments and final survey reports are archived with the NPS Social Science Program as part of the Social Science Studies Collection. This collection is available to researchers. Documents in the collection are currently available in digital format through the NPS Survey Request Solution – SRTS (https://irma.nps.gov/Srts/).




Legal Justification:

  • The National Park Service Act of 1916 (54 U.S.C. 100702; previously 16 U.S.C. 1a-7) requires that the National Park Service (NPS) preserve the national parks for the use and enjoyment of present and future generations. At the field level, this means resource preservation, public education, facility maintenance and operation, and physical developments that are necessary for public use, health, and safety.


  • The National Environmental Policy Act of 1969, as amended in 1982 (Sec. 102 [42 U.S.C. § 4332(A)])

The Federal Government shall utilize a systematic, interdisciplinary approach which will insure the integrated use of the natural and social sciences and the environmental design arts in planning and in decision making which may have an impact on man's environment.


  • The Government Performance and Results Act of 1993 (P.L. 103-62). GPRA goals IIa1 and IIb1:

IIa1: Visitors safely enjoy and are satisfied with the availability, accessibility, diversity, and quality of park facilities, services and appropriate recreational opportunities.


IIb1: Park visitors and the general public understand and appreciate the preservation of parks and their resources for this and future generations.


  • Executive Order 12862 – “Setting Customer Service Standards”

This Executive Order of September 11, 1993, is aimed at “ensuring the Federal Government provides the highest quality service possible to the American people.” The E.O. requires surveys as a means for determining the kinds and qualities of service desired by the Federal Government’s customers and for determining satisfaction levels for existing service.


  • Executive Order 13571 – “Streamlining Service Delivery and Improving Customer Service”

This Executive Order of April 27, 2011, mandates “establishing mechanisms to solicit customer feedback on Government services and using such feedback regularly to make service improvements.”


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. Be specific. If this collection is a form or a questionnaire, every question needs to be justified.


The Programmatic Clearance Process is limited to applied research that will be used to answer specific questions with direct application to NPS visitor use or management. The focus will be on non-controversial information collections that do not attract attention to significant, sensitive, or political issues. Basic research that is driven purely by curiosity and a desire to expand social science knowledge, and that is not directly applicable to current NPS management and planning needs, will be directed to the full review process. The information collected will be used to improve the services and products that NPS provides to the public. Park managers and planners have used these data to support all aspects of visitor use, planning, monitoring, interpretation, and education.


Examples of significant, sensitive, or political issues include: seeking opinions regarding political figures; obtaining citizen feedback related to high-visibility or high-impact issues like the reintroduction of wolves in Yellowstone National Park, the delisting of specific Endangered Species, or drilling in the Arctic.


The Programmatic Clearance is intended to collect applied visitor use research data (qualitative or quantitative) only. Therefore, survey instruments approved under the authority of the clearance must focus on visitor use and satisfaction data. No instruments seeking to collect information beyond the scope of visitor data will be considered under this Programmatic Clearance. Information collections that are experimental in nature or design (e.g., testing methods or developing a new method) will not be accepted and will be returned to be submitted through the normal PRA process.


All information collection instruments will be designed and deployed based upon acceptable statistical practices and sampling methodologies, and will be used to obtain consistent, valid data that are representative of the target populations and account for non-response bias, according to the most recent OMB guidance on “Agency Survey and Statistical Information Collections.”
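For illustration only, the sketch below shows one simple way a non-response bias check of the kind described above might be carried out. It is not a prescribed NPS procedure; the data, the comparison variable (group size), and the use of Welch's t statistic are hypothetical assumptions.

```python
# Illustrative, hypothetical non-response bias check: compare respondents with
# non-respondents on a characteristic recorded for everyone contacted on site.
from math import sqrt
from statistics import mean, stdev

respondent_group_sizes = [2, 4, 3, 2, 5, 2, 3, 4, 2, 3]   # contacts who returned the survey
nonrespondent_group_sizes = [1, 2, 2, 3, 1, 2, 4, 2]      # contacts who declined or did not return it

def welch_t(a, b):
    """Welch's t statistic for the difference in means of two independent samples."""
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(a) - mean(b)) / se

t = welch_t(respondent_group_sizes, nonrespondent_group_sizes)
print(f"Respondent mean group size:     {mean(respondent_group_sizes):.2f}")
print(f"Non-respondent mean group size: {mean(nonrespondent_group_sizes):.2f}")
print(f"Welch t statistic: {t:.2f} (a large absolute value suggests potential non-response bias)")
```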


The NPS Social Science Program will continue to provide technical and administrative review of proposed surveys and communicate review comments to investigators. In some cases, NPS may recommend that submitted proposals undergo review under the full PRA process, rather than the programmatic clearance.


If, after consultation with investigators, a proposed survey is recommended for approval by the Social Science Program, NPS will transmit the survey instrument and certification to OMB. OMB has agreed to provide a response within ten working days of receiving the submission package. Once it has received approval from OMB, NPS will assign the OMB control number and survey completion date. Typically, the survey completion date assigned is six months from the projected end date of the survey. NPS will notify the investigator that the survey is approved.


Uses of the Information

The Programmatic Clearance benefits the NPS by providing information concerning the following management and planning topics:

  • Service needs of customers

  • Strengths and weaknesses of services

  • Barriers and constraints to achieving customer service standards

  • Changes to customer service standards

  • Changes in service delivery over time

  • Improving public trust in government


The scientific communities that partner with the NPS in administering surveys benefit through:

1) an efficient, effective, and timely review process

2) a focus on peer review that improves the quality of information collections

3) increased attention to methodological improvements and use of best practices

4) better administration and wider sharing of information obtained from surveys of the public

5) renewed confidence in, and willingness to complete, the review process


Typical Information Collection Methods


  1. In-person or On-site Intercept Surveys: In a face-to-face situation, a survey instrument is provided to a respondent who will be instructed to complete it while on site and then return it to a specified person or location. This may include oral administration or the use of electronic technology and kiosks. The survey proctor will be prepared to answer any questions the respondent may have about how to fill out the instrument but will not interfere with or influence how respondents answer the questions. All information collections in this category will be restricted to no more than 15 minutes. There will be very few exceptions, and those will be considered on a case-by-case basis. This burden must be verified by evidence of pretesting with subjects not familiar with the development of the study. On-site surveys should be implemented in a manner that takes respondent fatigue into account. Previous survey results, without an examination of respondent burden, will not be accepted as the sole justification for approval.


  2. Mail Surveys: Using existing lists of customer addresses, a multiple-contact approach based on Dillman's “Tailored Design Method” will be employed. The first contact is a cover letter explaining that a survey is coming and why it is important to the agency. The second contact will be the survey instrument itself, along with a postage-paid addressed envelope for returning the survey. The third contact will be a reminder postcard sent 10 days after the survey is sent. Finally, the respondents will receive a letter thanking them for their willingness to participate in the survey and reminding them to return it if they have not already done so. At each juncture, respondents will be given multiple ways to contact someone with questions regarding the survey (including phone, web, and email). If the survey has been lost, the respondent can request that another be sent. Electronic mail is sometimes used instead of postal mail to communicate with customers. Although this is a cost-effective mode for surveying a large group of people, it does not usually generate the best response rate. Telephone calls to non-respondents can be used to increase response rates.


  3. Telephone Surveys: Existing or created databases that include telephone numbers will be used to contact potential respondents. An interviewer will use an approved dial-back method until someone is available. Telephone surveys are generally reserved for hard-to-reach interviewees or for following up with non-respondents.


  4. Focus Groups and Face-to-face Interviews: Selected individuals will be invited to participate in small group discussions or one-on-one interview sessions. A script is generally used to facilitate the discussions and is designed to encourage respondents to talk about experiences, preferences, needs, observations, or perceptions. A moderator, whose role is to foster interaction, leads the conversation. The moderator makes sure that all participants are encouraged to contribute and that no individual dominates the conversation. Furthermore, the moderator manages the discussion to make sure it does not stray too far from the topic of interest. Focus groups are most useful in an exploratory stage or when the bureau/office wants to develop a deeper understanding of a program or service. Using best practices in focus group research, groups will be constructed to include a cross-section of a given customer group.


  5. Web-based and On-line Surveys: For products or services that are provided through electronic means, whether e-commerce or web-based information, a web or email survey may be most appropriate. During the course of their web interaction, users can volunteer to add their names to a list for future surveys. From this list (recognizing that the group may be self-selected), a respondent pool will be selected in accordance with the sampling procedures outlined above. An email will be sent explaining the need for and importance of the survey, with a web link to the survey. Within 5 days, a follow-up email will be sent reminding respondents to complete the survey. Finally, respondents will receive an email thanking them for their willingness to participate in the survey and reminding them to complete it if they have not already done so. Respondents will always have the option to submit the survey in paper form, should they elect to do so.


Number of surveys conducted each year

Survey type                                   2013    2014¹   2015
Mail surveys                                    15      22       7
On-site surveys                                 11       6       8
Telephone surveys                                0       0       2
Focus groups and face-to-face interviews         1       6       3
Web-based and on-line surveys                    0       2       2
Total                                           27      36      21


Types of Questions asked

There are 12 social science-related topic areas that are used to guide the development of survey instruments; the topic areas are organized under five general overarching themes, which were used to guide the development of the Topic Areas. The five themes are:


1. General demographics: General demographic information may be gathered in order to better understand the respondent within the context of the collection. Demographic data will range from how many times respondents have used or visited an NPS site within a specific timeframe to their ethnic group and race. Sensitivity and prudence will be used in developing and deploying questions under this topic area so that the respondent does not perceive an intrusion upon his/her privacy. Additionally, these questions will only be asked as long as the data are critical to understanding customer satisfaction and the character of the respondent base. Demographics may also be used as part of a non-response bias strategy to ensure responses are representative of the contact universe.


2. Delivery and use of products, information, and services: The information requested will be used to target areas such as: timeliness, appropriateness, accuracy of information, courtesy, efficiency of service delivery, and resolution of issues with products. Responses will be assessed to plan and inform efforts to improve or maintain the quality of service offered to the public. The NPS is also interested in respondents’ opinions concerning the accessibility and accuracy of the information provided either on-site or virtually as well as their feedback regarding how well programs are administering specific processes.


3. Management practices: This area covers questions relating to how well customers are satisfied with management practices and processes at NPS sites, what improvements they might make to specific processes, and whether or not they feel specific issues were addressed and reconciled in a timely, courteous, responsive manner. Questions within this area may involve feedback regarding how well NPS engaged respondents on a specific topic.


4. Mission management: Questions will ask for feedback regarding how well we are carrying out our responsibilities to protect and manage public lands as stated in our mission statement. Questions will specifically ask customers to provide responses related to our capacity to protect, conserve, and preserve the natural, cultural, and recreational resources that we manage.


5. Rules, regulations, policies: This area will focus on gaining insight regarding fairness, adequacy, and consistency in rules, regulations, and policies for which NPS is responsible. It will also help us understand public awareness of rules and regulations and whether or not they are explained in a clear and understandable manner. It will not seek opinions from customers regarding the appropriateness of regulatory rulings themselves.

TOPIC AREAS WITHIN THE SCOPE OF THE PROGRAMMATIC CLEARANCE

To qualify for the programmatic clearance process, all questions in a survey must fit within one or more of the approved topic areas and must be approved by the NPS and OMB. Researchers have flexibility, within accepted standards of good survey design and OMB regulations, to develop specific questions within the topic areas. The 12 topic areas are identified below. A description of the scope of each topic area follows.


TOPIC AREA 1: Respondent Characteristics

The questions in this section will be used to characterize the population of respondents participating in each sample. Individual characteristics collected will be attributes of individual park visitors or visitor groups, potential visitors or groups, and residents of communities near parks. Individual characteristics collected will be relevant and limited to the mission, management, and/or operations of National Park System units. The scope of the information will be limited to characteristics that are germane to the topic being studied and relevant to the park and its management. Variables such as age, education, and knowledge are often good predictors of demand and visitation behavior.


TOPIC AREA 2: Trip Planning

The section on Trip Planning includes aspects of travel which affect a trip, or decisions which individuals make prior to, during, or following their trips to parks, related areas, and nearby communities. Trip characteristics included in the scope of this topic area will be relevant to the mission, management, and/or operations of National Park System units. The scope of the information collected will be limited to characteristics that are germane to the topic being studied and relevant to the park and its management.


TOPIC AREA 3: Transportation

The questions in this section will be used to support management goals and to develop strategies to meet transportation needs. These strategies address current and future land use, economic development, traffic demand, public safety, health, and social needs.


TOPIC AREA 4: Trip Characteristics

This topic area will address five high-level questions: (1) who travels to National Parks; (2) what information sources are used during visits to National Parks; (3) when technology is used during a visit; (4) how transportation is used by visitors at their destination; and (5) length of stay, number of people in the travel party, and travel mode.


TOPIC AREA 5: Recreation History

Individuals participate in many activities during their visits to parks, related areas and nearby communities. The questions in this section will be used to identify individual activities, behaviors, or uses of natural and cultural resources which are relevant to the mission, management, and/or operations of National Park System units. Understanding the current and future uses and purposes of park visitors will be helpful to managers when considering updating park management plans and educational efforts.


TOPIC AREA 6: Visitor Experiences

Crowding and conflict are among the most intractable problems faced by recreation managers. Concern over rising visitation in parks, and accompanying impacts on resources and on visitor experience, has led the National Park Service to focus increasing attention on the concept of crowding and carrying capacity. Crowding and conflict arise from the social encounters and interactions among recreationists, ranging from basic competition for space (e.g., crowding) to conflicts between forms of activity and related expressions of acceptable or appropriate use. Research on crowding norms and conflict has been particularly helpful in establishing guidelines for “Limits of Acceptable Change” planning efforts by identifying key social impact indicators and the data required to monitor them. The questions in this topic area will help managers understand the factors associated with the acceptability of crowding and visitor carrying capacity.


There may be scenarios where local visitors, who visit a site more frequently and spend less overall in the local tourism economy, have different perceptions of crowding than visitors traveling from afar. Those interpreting survey results should be aware of such patterns when making management decisions. Therefore, questions on crowding should be paired with questions on the trip origination point of the respondent and perhaps at least a coarse sense of expenditures.


TOPIC AREA 7: Evaluation of Services and Programs

Public opinion of services and facilities helps management teams understand the values people hold in relation to park resources and the visitor experience and is critical to creating plans that can be successfully implemented. Underlying all fundamental planning decisions are competing values, which must be resolved by a decision as to which value is of greater importance in a particular situation. A planning decision is the compromise between competing values at a given point in time. Understanding public values enables the management teams to make informed planning decisions. The questions in this topic area will be used to help managers learn about public concerns, issues, expectations, and values.


{NEW} TOPIC AREA 8: Human Dimensions of Natural Resources

Questions in this topic area will be used to collect information concerning the public's awareness and observations of the natural and social environments in the parks. Preferences, motives, and attitudes will be measured to determine how individual observations influence overall experiences. For purposes of the programmatic clearance process, perception questions will be limited to topics the park or the NPS can manage and control, as well as current or potential goods and services. The questions in this topic area could be used to give resource managers an understanding of the public’s values, perceptions, and beliefs, as well as the social consequences of management actions.


{NEW} TOPIC AREA 9: Natural/Cultural Resource Management

Natural and Cultural Resource Management offers research and opinions on the use and conservation of natural resources, the protection of habitats, and the control of hazards, spanning the field of environmental management without regard to traditional disciplinary boundaries. The questions in this section will aim to identify factors affected by conflicts that arise between meeting visitor/public needs and protecting resources.


TOPIC AREA 10: Expenditures

Visitor expenditure and income information is needed to calculate the economic impact and benefit of park visitation. Economic impact measures how much the money people spend visiting parks and surrounding areas contributes to the local economy in terms of jobs and income. Economic benefit measures how much visitors value a park above what they spend, or the increase in social welfare a park provides. Accurate impact assessment requires identification of those portions of expenditures that occur in the local region and inside the park. Questions will be used to develop average spending estimates for visitors to the park and local region. Income and income forgone questions provide information about the value, or economic benefit, of a visit to the park.


The use of questions that employ stated preference or stated choice techniques to estimate consumer surplus values and non-market values associated with park visitation is outside the scope of this approval. The responses to the questions in TOPIC AREA 10 are not intended to be used or combined with any other survey responses outside the scope of the proposed survey request. Results should only be aggregated to the population of visitors to the specific park unit for which the approval is granted. There should be no attempt to disaggregate any values to generalize the results above or beyond the scope of the intended, proposed, and approved purpose.


{NEW} TOPIC AREA 11: Constraints and Barriers for Non-Visitors

Over the years, the NPS has used the programmatic clearance process to develop an understanding of current visitors, which continues to build a detailed picture of its core users. In order to underpin and complement this knowledge, research is necessary to develop a better understanding of the non-visitor market, including lapsed and invisible visitors. Specifically, research is needed to identify the demographic and regional profile of these groups, and to clarify the relationship between perceptions of the National Parks, key influences, and the decision-making process. In order to optimize the usefulness of the program, additional surveys should include non-visitors, as well as virtual visitors, underserved communities, and stakeholders, so that comparisons can be made and insight drawn as appropriate.

{NEW} TOPIC AREA 12: Safety and Injury Prevention

This new topic area will be used to explore visitor awareness, knowledge, and perception of safety and injury prevention measures. Questions are tailored to cover aspects of individual activities and behaviors, and the acceptability of current safety practices. Understanding the factors associated with visitor behavior and perceptions of public risk management is critical to the enforcement, education, and emergency services that can be successfully implemented to reduce injury in parks.


Four new topic areas have been added to the Pool of Known Questions for review and consideration of approval: 1) Human Dimensions of Natural Resources, 2) Natural and Cultural Resource Management, 3) Constraints and Barriers for Non-Visitors, and 4) Safety and Injury Prevention. Each of these topic areas is well embedded in the lexicon of social science research. There are no controversial or sensitive questions included in these topic areas. These areas will add to the program’s focus on how people’s knowledge, values, and behaviors influence, and are affected by, decisions about the use, conservation, and management of national parks.


How are data used?

Historically, managers and program specialists have used these data to identify:

  • needs

  • levels of understanding and knowledge

  • ideas or suggestions for improvement

  • barriers and constraints to achieving customer service standards

  • perceptions and values

  • baseline measurements to observe changes over time

  • spending behaviors


Submittal Process

The NPS sponsor for each collection is required to ensure that the requirements for complete submission are satisfied before submitting the collection under the Programmatic Clearance Process for NPS Public Surveys. The requirements include a completed submission form and the final version of all documents that will be used in the collection (questionnaires, interview scripts, maps, photographs, correspondence, etc.). The submission form serves as the supporting statement for the collection. It provides the justification, sampling dates and locations, sampling procedures, and respondent burden.


The Principal Investigator (PI) will typically initiate the process by filling out the form and developing the survey instrument; however, the park sponsor is responsible for reviewing the form and the instruments to ensure that park policy, purpose and need, and relevancy are guiding the collection.

The Social Science Program will continue to encourage investigators to use the Pool of Known Questions developed for this process when it is consistent with the purpose of their studies. We acknowledge that there are other questions, such as those measuring visitor experience, use history, and travel behaviors, that have been used and validated in numerous recreation surveys. Any variations of questions within each Topic Area will be considered on a case-by-case basis, and their applicability will be determined before the final version is submitted to OMB for review and consideration of approval.


Programmatic Clearance Workflow

[Workflow diagram] Abbreviations: NPS ICRC – NPS Information Collection Review Coordinator; PI – Principal Investigator; ICR – Information Collection Request.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden and specifically how this collection meets GPEA requirements.


Individual surveys conducted under this Programmatic Clearance will vary in the methods used to contact the public. At least 70% of the surveys conducted under this program will consist of on-site and/or mail-back surveys. About 50% of those surveys will offer electronic response options (e.g., Survey Monkey, Qualtrics, etc.). About 25% will be collected by way of face-to-face or telephone interviews or small focus groups, and the remaining 5% will be in the “other” category. In all cases, appropriate non-response bias strategies will be used to ensure that responses are representative of the contact universe.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


This effort attempts to be sensitive to any duplication of efforts by other entities. Any possible duplication will be examined during the technical and administrative review of individual submissions. We continue to work with the NPS research permitting administrators to link the information collection process and the permitting process, so that submissions from other non-federally sponsored programs that may have an information collection component are known. In the past, some research conducted by universities or non-governmental organizations (NGOs) that included surveys of park visitors was given research permits but was not reviewed by the NPS Social Science Program. We are actively working to close that gap. The first goal of this effort is to identify duplication on site and within the agency.


We acknowledge that there are and have been other collections and programs2 that are used to provide information about the outdoor recreation patterns on a national level; however, these collections do not typically cover the types of management and planning issues that are central to individual units of the National Park System.


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


A significant advantage of this process is to gather information needed without putting additional burden on small entities. Sampling will be used and the number of questions on the surveys will be kept to a minimum. Use of electronic means also potentially reduces the burden on small entities.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


This Programmatic Clearance Process for NPS-Sponsored Public Surveys has allowed the NPS to successfully navigate the PRA process in an expedited manner. This process simplifies and streamlines the information collection requests to OMB in a manner that allows the NPS to submit more requests per year than it would through the regular submission route. Surveys are reviewed and approved expeditiously, allowing data collections to occur more frequently and in a timely manner – more specifically, during the visitation season of interest. In the 16 years of the programmatic approval, an average of 40 new surveys has been approved each year in support of NPS management and planning. This is nearly five times as many as we would expect going through the regular submission route.


Since 2012, approval has typically been granted in less than 14 days from the date the NPS Information Collection Review Coordinator (ICRC) submits the ICR to OMB for review. From FY 1999 through FY 2015, this generic ICR process has produced an estimated annual cost savings to the Federal government of about $2,500,000.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information, unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


This renewal request contains no special circumstances with respect to 5 CFR 1320.5(d)(2)(i) and (iii) through (viii). The exception is (d)(2)(ii): we may ask respondents to send back their responses in fewer than 30 days after receipt of the survey.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and in response to the PRA statement associated with the collection over the past three years, and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every three years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


On May 29, 2015, a 60-day Federal Register notice (80 FR 30720) was published announcing this information collection renewal. Public comments were solicited for 60 days, ending July 28, 2015. We received one public comment in response to that notice. The commenter did not support the renewal of the collection and stated that “once every 5 years is often enough to take these surveys.” In response to this comment, NPS contends that this renewal is not for a single annual survey; rather, it facilitates a set of independent surveys administered at multiple sites for a variety of NPS management needs. In the 16 years of the programmatic approval, an average of 40 new surveys has been approved each year in support of NPS management and planning.


In addition to the request for public comment, we informed individuals who had served as principal investigators on NPS-sponsored public surveys during FY 2012-15 that the 60-day Federal Register notice had been published. After receiving no written comments, the NPS telephoned five previous submitters to ask specifically about their impressions of the Programmatic Clearance Process. The commenters supported continuing the programmatic approval process and suggested that an updated version of the Pool of Known Questions would be appreciated.


These comments were encouraging because we were already working to resolve the issue concerning the Pool of Known Questions. In 2014, we requested and were granted a one-year extension of our previously approved collection. We requested this extension because, at that time, users of the Pool of Known Questions had begun to complain that “many of the questions are considered to be outdated or underused” and “there are other topic areas that should be included (e.g., interpretation and education, evaluation and human dimensions).”


The extension period was used to develop a working team of 11 social scientists (one for each topic area) who were familiar with the program. The researchers were also selected because of their technical knowledge and experience in the topic areas. During this effort, each of the recruited researchers was asked to develop a sub-working group of peers to help revise, re-write, and re-design the Pool of Known Questions for their topic areas.


The questions in the revised Pool of Known Questions came primarily from the following four sources:

  1. Current and expired OMB approved recreation based surveys (all federal agencies)

  2. Previous non-federally funded studies conducted by the working group members

  3. Surveys found in peer-reviewed literature

  4. Variations of any question in the currently approved Pool of Known Questions approved by OMB between 2013 and 2015.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


Incentives, remuneration, and gifts are generally deemed inappropriate within the scope of the Programmatic Clearance Process for NPS Sponsored Surveys. In some cases, the provision of gifts and incentives to respondents may appear to be a conflict of interest. However, there may be extraordinary circumstances under which remuneration may be appropriate within the scope of this program. In the event that there are collections that seek to use incentives, the program manager will be required to justify the purpose and need of the proposed incentive; the proposed purpose will be reviewed and considered.


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


We will not provide any assurance of confidentiality to any respondents. Data collected will only be reported in aggregates and no individually identifiable responses will be reported.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


The questions used on these surveys will not be of a sensitive nature.


12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under “Annual Cost to Federal Government.”


Based on experience with the existing NPS Programmatic Clearance, we estimate that there will be approximately 41,500 annual respondents. Given these estimates, NPS anticipates an annual burden of 11,283 hours for these proposed collections. This renewal also includes nine surveys currently approved under this OMB control number, totaling 13,700 annual responses and 3,278 annual burden hours.


We estimate the total dollar value of the annual burden hours for this collection to be $357,106 (rounded). We arrived at this figure by multiplying the estimated annual burden hours for each survey type by $31.65, an hourly value of respondent time that includes benefits. This wage figure is based on the Bureau of Labor Statistics (BLS) National Compensation Survey: Occupational Wages in the United States estimate of average full compensation for private industry (BLS news release, Employer Costs for Employee Compensation, September 9, 2015, reflecting June 2015 data; http://www.bls.gov/news.release/pdf/ecec.pdf).


TABLE 1. Total Estimated Annualized Burden

Survey type                           Annual Number   Estimated Completion   Total Annual   Dollar Value of Burden    Total Dollar Value of
                                      of Responses    Time per Response¹     Burden Hours   Hour Including Benefits   Annual Burden Hours²
On-site surveys                       20,000          15 minutes             5,000          $31.65                    $158,250
Mail-back surveys                     10,000          20 minutes             3,333          $31.65                    $105,489
All non-response surveys              6,500           3 minutes              325            $31.65                    $10,286
Telephone surveys                     1,000           30 minutes             500            $31.65                    $15,825
Focus groups/in-person interviews     1,500           60 minutes             1,500          $31.65                    $47,475
Other                                 2,500           15 minutes             625            $31.65                    $19,781
Annual TOTAL                          41,500                                 11,283                                   $357,106
3-Year Total                          124,500                                33,849                                   $1,071,318

¹Average time  ²Total hours rounded
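As an illustrative aside (not part of the submission requirements), the short sketch below reproduces the arithmetic behind Table 1: annual responses multiplied by completion time gives burden hours, and burden hours multiplied by the $31.65 hourly value gives the dollar figures. All inputs are taken from the table above.

```python
# Recompute the Table 1 burden-hour and dollar-value figures from the per-mode inputs above.
WAGE = 31.65  # dollar value of one burden hour, including benefits

modes = {  # survey type: (annual responses, minutes per response)
    "On-site surveys": (20_000, 15),
    "Mail-back surveys": (10_000, 20),
    "All non-response surveys": (6_500, 3),
    "Telephone surveys": (1_000, 30),
    "Focus groups/in-person interviews": (1_500, 60),
    "Other": (2_500, 15),
}

total_hours, total_cost = 0, 0
for name, (responses, minutes) in modes.items():
    hours = round(responses * minutes / 60)   # annual burden hours for this mode
    cost = round(hours * WAGE)                # dollar value of those hours
    total_hours += hours
    total_cost += cost
    print(f"{name}: {hours:,} hours, ${cost:,}")

print(f"Annual total: {total_hours:,} hours, ${total_cost:,}")          # 11,283 hours, $357,106
print(f"3-year total: {total_hours * 3:,} hours, ${total_cost * 3:,}")  # 33,849 hours, $1,071,318
```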


13. Provide an estimate of the total annual non-hour cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected in item 12.)

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information (including filing fees paid for form processing). Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government or (4) as part of customary and usual business or private practices.


There are no non-hour cost burdens, recordkeeping requirements, or fees associated with this collection.


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.


The total annualized cost to the Federal government is estimated to be $1,525,780. This estimate is based upon our experience with the development and execution of each collection. Because costs are determined on a case-by-case basis, we have assumed an average of 21 hours of Federal staff time associated with the development of each programmatic submission. We estimate that there will be 20 submissions in FY16, based on the average number of annual submissions from FY12 through FY15.


The table below shows Federal staff and grade levels performing various tasks associated with this information collection. We used the Office of Personnel Management Salary Table 2015 General Schedule (GS) Locality Pay Tables to determine the hourly rate (see: https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2015/DEN_h.pdf). We multiplied the hourly rate by 1.5 to account for benefits (as implied by the BLS news release mentioned above).


Position           Grade/Step   Hourly Rate   Hourly Rate incl. benefits   Estimated time (hours)   Cost per       Annual Cost
                                              (1.5 x hourly rate)          per submission           submission     (x 20)
Program Manager    12/5         $40.92        $61.38                       21                       $1,289         $25,780


We estimate the operational cost to the Federal Government to be $1,500,000 per year. This cost includes the expenses listed in Table 2 below: non-federal FTE, travel, equipment, and operating costs associated with this information collection.


Table 2. Costs associated with this information collection

Operational Expenses                                              Cost per submission   Annual Cost (x 20)
Researcher/Principal Investigator                                 $25,000               $500,000
Contracts and Support (survey design and development, survey
  administration, data collection, data entry, data analysis,
  and reporting)                                                  $50,000               $1,000,000
TOTAL                                                             $75,000               $1,500,000
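For illustration, a minimal sketch of the item 14 arithmetic, using only figures stated above (the GS-12/5 hourly rate, the 1.5 benefits multiplier, 21 hours per submission, 20 annual submissions, and the Table 2 operational costs):

```python
# Recompute the annualized cost to the Federal government from the item 14 figures above.
HOURLY_RATE = 40.92            # GS-12 step 5 hourly rate (OPM 2015 locality table)
BENEFITS_MULTIPLIER = 1.5      # loading applied to account for benefits
HOURS_PER_SUBMISSION = 21      # estimated Federal staff time per submission
SUBMISSIONS_PER_YEAR = 20      # estimated FY16 submissions (FY12-FY15 average)

loaded_rate = HOURLY_RATE * BENEFITS_MULTIPLIER                         # $61.38
staff_cost_per_submission = round(loaded_rate * HOURS_PER_SUBMISSION)   # $1,289
annual_staff_cost = staff_cost_per_submission * SUBMISSIONS_PER_YEAR    # $25,780

operational_per_submission = 25_000 + 50_000                            # Table 2: PI plus contracts/support
annual_operational_cost = operational_per_submission * SUBMISSIONS_PER_YEAR  # $1,500,000

print(f"Total annualized cost: ${annual_staff_cost + annual_operational_cost:,}")  # $1,525,780
```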


15. Explain the reasons for any program changes or adjustments.


The numbers show a burden increase because this is a three-year request, so the burden covers three years of surveys conducted under our generic clearance program, whereas the previous request and approval covered only one year. However, the actual annual burden is estimated to decrease because we expect fewer survey submissions per year over the next three years, due in part to the ending of our contract with the Visitor Services Project (VSP) at the University of Idaho.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Each information collection considered under the Programmatic Clearance Process will use a submission form to describe the proposed collection. The information will include a justification, location, sampling methods, and respondent burden. Each submission will include a method to check for non-response bias and will describe the intended use of the results.


Each information collection will provide an explanation of the specific tabulation methods to be used to synthesize, analyze, and aggregate the data collected. The data will be gathered primarily for internal NPS use, so it is not expected that such data will be published. However, if the results of a particular survey are to be published or otherwise made public, that fact will be disclosed in the completed Justification Form for that survey.


The analyses will typically include response frequencies, means, standard deviations, and confidence intervals used to address concrete management and planning issues. In cases where expenditure data are collected, the NPS Money Generation Model (MGM) may be used. This model is used to estimate the economic impact of visitor spending on gateway regions. In some cases, more complex multivariate statistical analyses are performed, as when estimating coefficients for models based on the Theory of Planned Behavior. In other cases, data from qualitative studies may involve transcripts of interviews or focus group discussions, followed by content analyses to identify general themes.
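As a hedged example of the descriptive tabulations described above, the sketch below computes frequencies, a mean, a standard deviation, and an approximate 95% confidence interval for a single hypothetical 5-point satisfaction item; it is illustrative only and not a required analysis procedure.

```python
# Typical descriptive tabulation for one 5-point satisfaction item (hypothetical ratings).
from collections import Counter
from math import sqrt
from statistics import mean, stdev

ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4, 4, 5, 3, 4, 5]  # 1 = very dissatisfied ... 5 = very satisfied

freq = Counter(ratings)                        # response frequencies
m = mean(ratings)                              # mean rating
sd = stdev(ratings)                            # sample standard deviation
half_width = 1.96 * sd / sqrt(len(ratings))    # normal-approximation 95% CI half-width

print("Frequencies:", dict(sorted(freq.items())))
print(f"Mean = {m:.2f}, SD = {sd:.2f}, 95% CI = ({m - half_width:.2f}, {m + half_width:.2f})")
```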


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The OMB expiration date will be displayed on the information collection instruments.


18. Explain each exception to the topics of the certification statement identified in "Certification for Paperwork Reduction Act Submissions."

We are requesting no exceptions to the certification statement.

1 Visitor Services Project (VSP) contract ended

2 These include: the National Survey of Fishing, Hunting, and Wildlife-Associated Recreation (#1018-0088); the National Survey on Recreation and the Environment (#0596-0127); the NPS Comprehensive Survey of the American Public (NPS); the National Visitor Use Monitoring Program (#0596-0232); and the Interagency Recreation and Transportation Survey (#0596-0236).


