NPS Form 10-201 (Rev. 09/2016) OMB Control No. 1024-0224
National Park Service Expiration Date XX/XX/XXXX
PROGRAMMATIC REVIEW AND CLEARANCE PROCESS
FOR NPS-SPONSORED PUBLIC SURVEYS
The scope of the Programmatic Review and Clearance Process for NPS-Sponsored Public Surveys is limited and will only include individual surveys of park visitors, potential park visitors, and residents of communities near parks. Use of the programmatic review will be limited to non-controversial surveys of park visitors, potential park visitors, and/or residents of communities near parks that are not likely to include topics of significant interest in the review process. Additionally, this process is limited to non-controversial information collections that do not attract attention to significant, sensitive, or political issues. Examples of significant, sensitive, or political issues include: seeking opinions regarding political figures; obtaining citizen feedback related to high-visibility or high-impact issues like the reintroduction of wolves in Yellowstone National Park, the delisting of specific Endangered Species, or drilling in the Arctic National Wildlife Refuge.
SUBMISSION DATE: 10/6/2020
PROJECT TITLE: Virtual Visitor Study Pre-Test
ABSTRACT: (not to exceed 150 words)
The Virtual Visitor Pre-Test study will test an online survey instrument with “virtual visitors” of NPS digital assets, including NPS.gov, park-managed social media accounts, and Harpers Ferry Center (HFC)-developed apps. The study’s core objectives are to characterize the NPS virtual visitor population, describe their motivations, and determine which platforms are best meeting visitor expectations. The pre-test will assess the instrument’s suitability to address these objectives and allow for any necessary refinement of the survey instrument and administration methods. The outcomes of this overall research effort will position the NPS to strategically deploy resources where digital information needs are not being sufficiently met, to strengthen programs where the highest impact is being provided, and to establish a stronger virtual visitor monitoring program that is consistent with the physical visitor use monitoring program already in place.
PRINCIPAL INVESTIGATOR CONTACT INFORMATION:
Name: Kevin Hathaway Title: Vice President
Affiliation: RSG, Inc. Phone: 802.295.4999
Address: 55 Railroad Row White River Junction, VT 05001
Email: [email protected]
PARK OR PROGRAM LIAISON CONTACT INFORMATION:
Name: Todd Edgar Title: Solutions Architect
Affiliation: Web Services Division Phone:
Address:
Email: [email protected]
PROJECT INFORMATION:
Where will the collection take place? Online (NPS.gov, HFC apps, Facebook, Instagram, Twitter)
Sampling Period Start Date: October 2020 Sampling Period End Date: November 2020
Type of Information Collection Instrument: (Check ALL that Apply)
☐ Mail-Back Questionnaire   ☐ Face-to-Face Interview   ☐ Focus Groups
☐ On-Site Questionnaire   ☐ Telephone Survey
☒ Other (List): Online Survey
Will an electronic device be used to collect information? ☐ No ☒ Yes – Type of Device: Respondents’ personal devices (including smartphones, laptops, desktop computers, tablets, etc.)
SURVEY JUSTIFICATION:
Social science research in support of park planning and management is mandated in the NPS Management Policies 2006 (Section 8.11.1, “Social Science Studies”). The NPS pursues a policy that facilitates social science studies in support of the NPS mission to protect resources and enhance the enjoyment of present and future generations (National Park Service Act of 1916, 38 Stat 535, 16 USC 1, et seq.). NPS policy mandates that social science research will be used to provide an understanding of park visitors, the non-visiting public, gateway communities and regions, and human interactions with park resources. Such studies are needed to provide a scientific basis for park planning and development.
The National Park Service (NPS) directly manages thousands of digital media web pages, mobile apps, and social media channels. The NPS also supplies content, directly or indirectly, for third-party providers that share digital media information with the public. Within this large digital ecosystem, millions of online users around the world rely on these resources to quickly and easily access a wide range of information. Furthermore, the second century of the NPS has involved a greater focus on community engagement beyond physical visitation alone; engaging the American public through digital opportunities and activities is increasingly recognized as an important opportunity for the NPS to build relationships with individuals who may have limited awareness of, interest in, or ability to visit national parks themselves. It is therefore increasingly important that NPS’s universe of digital assets be assessed according to user experience in order to determine their value to the public.
While the NPS has an established program for measuring and monitoring physical visitation to its park units, much less is known about virtual visitor behavior and experience across all NPS digital assets. This project will evaluate NPS’s primary public-facing digital platforms that include NPS.gov, park-specific social media accounts, and official mobile apps.
The objective of this request is to survey NPS’s virtual visitor population to describe their motivations for using specific digital platforms, and to help determine if the current suite of digital media resources meets NPS objectives. To date, there is no information available to measure virtual visitor behavior and sentiment across the full universe of NPS digital assets. This request covers a pre-test of the overall approach, sampling plan, administration strategies, and survey instrument. The pre-test will help ensure that the survey is conducted in a manner that maximizes the ability to achieve the subsequent full-scale study’s overall objectives.
This pre-test will test and validate the following:
Administration methods: Fielding the survey across the distinct digital platforms and social media accounts will require coordination with asset managers throughout the NPS organization. While the survey will be centrally hosted on a dedicated web platform, the actual dissemination of invitations to the public will be carried out at the asset or resource manager level within NPS. A primary outcome of the pre-test will be to test and refine the proposed administration methods to ensure efficient, smooth, and error-free distribution of survey invitations to a diverse cross-section of virtual visitors.
Response rates: Click and response rate assumptions will be based on conservative estimates obtained through similar research, and the pre-test will be used to assess those assumptions in order to guide data collection efforts for the full launch. During the pre-test phase, click rates, social media engagements, completion times, and response rates will be carefully monitored to better assess the level of effort and dissemination required to successfully collect the full sample.
Survey answers: The study team will conduct a thorough analysis of responses to the survey questions to assess the instrument’s design, logic, respondent comprehension, and performance at meeting the study’s overall objectives. The questionnaire will contain channel-specific (Facebook, Twitter, Instagram, HFC apps, or NPS.gov) questions that will be evaluated separately. The results of the pre-test will inform any necessary adjustments or updates to the instrument before the full administration.
Device-specific differences: Although the instrument will be designed to operate seamlessly on mobile devices, the pre-test will be used to assess the approach by looking for any device-specific differences in responses. This could entail examining survey completion times and dropout rates, both by device and by the digital resource the respondent was recruited from.
We will use the results of the pre-test to revise the instrument and methods and will submit a subsequent IC request to OMB for review and approval for full implementation.
SURVEY METHODOLOGY:
Respondent Universe:
The respondent universe for this data collection effort will be any virtual visitors, age 18 or older, who meet the following criteria over the course of the sampling period:
View NPS digital content on NPS.gov.
Use an HFC-developed app on their mobile device.
View social media accounts (Facebook, Twitter, or Instagram) managed by one of the thirty-four participating park units that have agreed to post invitations to the survey on those platforms.
Sampling Plan / Procedures:
The survey will be administered using a web-based instrument. Invitations to participate in the pre-test survey will be distributed through platform-specific outreach strategies described below. It is anticipated that the administration effort will take place over approximately four weeks. The pre-test effort will aim to collect 1,460 completed questionnaires, with a target of 292 completed responses per platform (Table 1); a worked sketch of this arithmetic follows the table.
Table 1: Pre-test Survey Target Completes by Platform Type
Platform Type | Target Complete Surveys
NPS.gov | 292
HFC Apps | 292
Social Media (Facebook) | 292
Social Media (Instagram) | 292
Social Media (Twitter) | 292
Total | 1,460
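For illustration, the short Python sketch below reproduces the arithmetic behind Table 1 and the contact totals used throughout this document: the per-platform invitation counts follow from dividing the 292-complete target by the assumed 1% completion rate (a planning assumption discussed in Section D, not an observed rate).

```python
# Derive invitation counts from the Table 1 targets and the assumed
# 1% completion rate (a planning assumption; see Section D).
PLATFORMS = ["NPS.gov", "HFC Apps", "Social Media (Facebook)",
             "Social Media (Instagram)", "Social Media (Twitter)"]
TARGET_COMPLETES = 292   # completed questionnaires targeted per platform
COMPLETION_RATE = 0.01   # assumed completion rate across all platforms

invites_per_platform = TARGET_COMPLETES / COMPLETION_RATE  # 29,200
total_invites = invites_per_platform * len(PLATFORMS)      # 146,000
total_completes = TARGET_COMPLETES * len(PLATFORMS)        # 1,460

print(f"Invitations per platform: {invites_per_platform:,.0f}")
print(f"Total invitations:        {total_invites:,.0f}")
print(f"Total completed surveys:  {total_completes:,.0f}")
```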
Because of differences in how virtual visitors interact with the content on each platform, it is necessary to implement platform-specific sampling strategies to invite virtual visitors to participate in the survey. Platform-specific hyperlinks will be developed for digital administrators of each resource to post or pin on platform pages, tweet directly into feeds, or provide via a website pop-up. The hyperlinks to the survey will be included within brief invitation text that will include NPS branding and concisely describe the survey and the benefits of participation. The unique hyperlinks will allow the survey instrument to automatically detect which platform the respondent has been directed from and permit the appropriate survey logic and question branching. Sampling strategies for each platform are described below.
NPS.gov: Respondents recruited at NPS.gov will receive a pop-up invitation in their web browsers that includes the platform-specific hyperlink to enter the survey. This invitation will be shown at random to 0.5% of visitors who arrive and view two or more pages on the NPS.gov website. From January through March 2020, NPS.gov had more than 39 million individual page views from 19 million unique sessions in which visitors viewed two or more pages, corresponding to roughly 208,000 candidate sessions per day. It is projected that obtaining a sample of at least 292 completed questionnaires will require selecting approximately 29,200 website sessions to be shown a pop-up invitation over the study period (for more information, see Section D, Expected Response Rate/Confidence Level). By keeping selection of visitors random, the distribution of visitors who are sent an invitation is expected to approximate the true population of visitors accessing NPS.gov resources. Table 2 shows expected invitations and daily contacts for the NPS.gov and HFC app platforms; a worked sketch of this arithmetic follows the table.
HFC Apps: HFC currently manages the development and maintenance of 28 official NPS mobile applications that the public may download to their smartphones, tablets, and e-readers. Most HFC apps have been developed on behalf of individual parks and include features such as interactive visitor maps, park alerts, and other site-specific and visitor information. Based on data from November 2018 to November 2019, we estimate approximately 1,316,721 total app downloads per year. If we assume that one download represents one candidate session, this corresponds to approximately 101,000 candidate sessions across the 28-day sampling period (Table 2). To recruit these users, HFC will program in-app notifications that pop up and invite the user to participate in the survey during an individual session. In-app notifications will be randomly distributed to 29% of users during the study period. To receive an invitation, a user must have the app open on their device and be connected to Wi-Fi or cellular data service. Once the respondent clicks the hyperlink, a survey instance will be initiated in the device’s default web browser. It is expected that gathering 292 completed questionnaires will require selecting approximately 29,200 app sessions to be shown a pop-up invitation (for more information, see Section D, Expected Response Rate/Confidence Level).
Table 2: Expected Invitations During the Sampling Period
Platform | Candidate Instances | Daily Invitations | Total Number of Invitations | % of Daily Sessions Contacted
NPS.gov | 5,824,000 | 1,043 | 29,200 | 0.5%
HFC Apps | 101,009 | 1,043 | 29,200 | 29%
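Under the same assumptions, the Table 2 figures can be reproduced as follows; the 28-day period length is inferred from the stated four-week administration window, and the daily session counts are this document’s own traffic estimates.

```python
# Reproduce the Table 2 figures from the stated traffic estimates.
SAMPLING_DAYS = 28             # four-week administration window
INVITES_NEEDED = 29_200        # per platform: 292 completes / 1% rate
daily_invites = INVITES_NEEDED / SAMPLING_DAYS      # ~1,043 per day

# NPS.gov: ~208,000 two-or-more-page sessions per day (Jan-Mar 2020).
npsgov_daily = 208_000
npsgov_candidates = npsgov_daily * SAMPLING_DAYS    # 5,824,000
npsgov_share = daily_invites / npsgov_daily         # ~0.5% of sessions

# HFC apps: ~1,316,721 downloads/year; one download ~ one session.
hfc_daily = 1_316_721 / 365                         # ~3,607 per day
hfc_candidates = hfc_daily * SAMPLING_DAYS          # ~101,009
hfc_share = daily_invites / hfc_daily               # ~29% of sessions

print(f"NPS.gov:  {npsgov_candidates:,.0f} candidates; "
      f"{npsgov_share:.1%} of daily sessions invited")
print(f"HFC apps: {hfc_candidates:,.0f} candidates; "
      f"{hfc_share:.1%} of daily sessions invited")
```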
Social Media Channels: Social media account managers at 34 park units have been recruited to post, tweet, or pin invitations on park-specific NPS social media accounts on behalf of the study team. Participating parks vary in their volumes of physical and virtual visitors and represent all seven of NPS’s administrative regions. In total, units included in the pre-test administration manage 33 Facebook profiles, 31 Twitter accounts, and 29 Instagram pages with a combined 7.8 million followers. Metrics describing daily visitors across the social media accounts in the sample frame are not available to inform the potential rate of incidence; a main outcome of the pre-test will therefore be to assess an appropriate cadence of posts and reminder posts on each of the three platforms and to determine what level of promotion generates enough engagement to meet the objective of 292 completed surveys per social media platform.
The distribution of actual survey invitations will be executed at the park level by the individual resource manager according to the sample design. Based on consultation with website and social media account managers, posts and tweets will be carried out across the study period to test potential engagement using a varied invitation approach. During the pre-test, invitation frequency (i.e., the number of times a manager will post and repost the invitation) will initially be standardized to three posts per park-specific social media platform across the first 14 days of the sampling period. The invitation rate will then be adjusted based on the observed response rate on each social media platform over the remaining sampling period. Throughout the pre-test, the study team will monitor engagement (number of clicks) and completions on each social media platform to establish approximate response rates and to inform both the sampling methodologies and the invitation strategy for the full implementation of the survey.
Overall, given a target of 1,460 completed questionnaires and an assumed completion rate of 1% across all platforms (see Section D, Expected Response Rate/Confidence Level, for more detail), the total number of visitor contacts across all platform types will be 146,000 visitors.
Instrument Administration:
The survey will be centrally hosted and programmed using a customized and proprietary computer survey platform to implement and administer the instrument, independent of NPS digital resources included in the study. This system provides a graphical user interface and dynamic branching to improve the efficiency and ease by which respondents can navigate and complete the instrument. All respondents who elect to participate will enter the study via a platform-customized weblink which will direct the respondent from their recruitment platform to a survey landing page through their default web browser. This means that a survey respondent using NPS.gov will be redirected to the same survey platform as someone using Twitter, Facebook, or Instagram, which enables all data to be collected and stored on a single, secure server. The platform-specific weblinks will automatically branch each respondent to the correct platform-specific questions. Respondents will be able to take the survey on whatever device they were using when they were recruited for the survey, including smartphones, laptops, desktop computers, tablets, etc.
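The survey platform’s internal implementation is not specified in this document; purely as an illustrative sketch of how a platform-specific weblink can drive question branching, the following assumes a hypothetical `src` query parameter embedded in each invitation hyperlink (the parameter name and branch labels are inventions for illustration, not details of the actual system).

```python
# Illustrative only: map a hypothetical 'src' query parameter embedded
# in each platform-specific invitation link to a survey question branch.
from urllib.parse import parse_qs, urlparse

# Hypothetical branch labels; not taken from the study documents.
BRANCHES = {
    "npsgov":    "nps_gov_questions",
    "hfcapp":    "hfc_app_questions",
    "facebook":  "facebook_questions",
    "instagram": "instagram_questions",
    "twitter":   "twitter_questions",
}

def select_branch(invitation_url: str) -> str:
    """Return the platform-specific question branch for a respondent
    arriving via a platform-specific invitation hyperlink."""
    params = parse_qs(urlparse(invitation_url).query)
    source = params.get("src", ["unknown"])[0]
    return BRANCHES.get(source, "generic_questions")

# Example: a respondent recruited from a park Twitter account.
assert select_branch("https://survey.example.org/vvs?src=twitter") == "twitter_questions"
```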
Per Dillman and Bowker’s (2000) principles for the design of web surveys, the landing page will be motivational, introduce the purpose of the study and the approximate time required to complete the survey, emphasize the anonymity of the data collection and the ease of responding, and instruct respondents on the action needed for proceeding to the next page to begin the survey. The survey will be pre-tested to ensure that the questionnaire is functional and aesthetically pleasing on different screen configurations, operating systems, browsers, and partial screen displays. Clear, specific instructions will be provided on each question to ensure that respondents understand how to take the computer action necessary to respond to each question (Dillman and Bowker, 2000).
The survey platform will be live for four weeks. A tracking page will be used to monitor survey results in real time, including survey starts, completions, and drop-off locations by platform type, to ensure that the survey is working as intended.
Expected Response Rate / Confidence Level:
The pre-test survey effort aims to contact a total of approximately 146,000 virtual visitors across the three platform types. Based upon other studies conducted by Resource Systems Group, Inc. (RSG), using similar prescribed sampling methods, as well as on the acceptance rate for similar surveys of NPS.gov virtual visitors, we anticipate that 1.3% of virtual visitors across platforms will click on the survey hyperlink, and 1% of all virtual visitors will complete the survey for a total target of 1,460 questionnaires.
Rates of response will likely vary by platform type:
NPS.gov: The ongoing NPS Web Monitor Survey, which deploys a similar pop-up invitation method and online administration, observed that approximately 1% of web visitors invited to participate opened the hyperlink to initiate a survey, and of these more than 80% went on to complete the questionnaire. Based on this response rate and additional measures we will take to maximize response, including best-practice invitation and survey design, the anticipated completion rate among those invited to participate is 1% for this collection. We plan to contact 29,200 visitors over the sampling period to meet our goal of 292 completed surveys.
Social Media: Generally, it is expected that response rates to social posts will be low; in similar research conducted by RSG, where survey recruitment has been handled through paid social media advertising, rates of response (the ratio of those who completed a questionnaire to those who viewed an invitation) were generally below 1%. Because this survey is being promoted through NPS social media channels, rather than paid posts, and targets virtual visitors who may already be interested in the NPS, it may be reasonable to expect somewhat higher rates of participation, but these benefits are not certain. A conservative completion rate of 1% is anticipated across social media platforms. As a result, we will aim to contact 29,200 virtual visitors per social media platform in order to receive 292 completed surveys per platform. Engagement and completions will be monitored throughout the course of the pre-test to inform any adjustments to sampling methodologies for the full-scale study.
HFC Apps: Based on comparable research conducted by RSG that has utilized trusted source email invitations, the completion rate is expected to mirror that of NPS.gov at approximately 1%.
Table 3 shows estimated response rates by platform type; a worked check of these figures follows the table. Rate of completion is defined as the portion of respondents who are invited and complete the questionnaire. Partial non-respondents are those who accept the invitation but do not complete every question of the questionnaire. Rates of partial non-response are assumed to be approximately 0.3% of all those invited; this rate is slightly higher than what was observed in the NPS Web Monitor Survey, owing to the slightly longer length of the present survey. Non-respondents are those who are invited and do not click to start a survey.
Table 3: Estimated Response Rates by Platform Type
Platform Type | Initial Contacts | Completion (1%) | Partial Non-respondents (0.3%) | Non-respondents (98.7%)
NPS.gov | 29,200 | 292 | 88 | 28,820
Social Media (Facebook) | 29,200 | 292 | 88 | 28,820
Social Media (Instagram) | 29,200 | 292 | 88 | 28,820
Social Media (Twitter) | 29,200 | 292 | 88 | 28,820
HFC Apps | 29,200 | 292 | 88 | 28,820
TOTAL | 146,000 | 1,460 | 438 | 144,102
Note: Per-platform cells are rounded to whole respondents; the TOTAL row applies the exact rates (1%, 0.3%, 98.7%) to all 146,000 contacts.
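The Table 3 cells follow directly from the stated rates; a minimal arithmetic check (per-platform cells are rounded, so summed rows can differ slightly from the exact totals):

```python
# Compute the Table 3 disposition counts from the stated rates.
CONTACTS_PER_PLATFORM = 29_200
N_PLATFORMS = 5
COMPLETE, PARTIAL = 0.01, 0.003           # completion and partial rates
NONRESPONSE = 1 - COMPLETE - PARTIAL      # 98.7% non-respondents

total_contacts = CONTACTS_PER_PLATFORM * N_PLATFORMS  # 146,000
completes = total_contacts * COMPLETE                 # 1,460
partials = total_contacts * PARTIAL                   # 438
nonrespondents = total_contacts * NONRESPONSE         # 144,102

print(f"Contacts: {total_contacts:,}  Completes: {completes:,.0f}  "
      f"Partials: {partials:,.0f}  Non-respondents: {nonrespondents:,.0f}")
```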
Detecting differences in survey recruitment and response characteristics across the tested NPS digital resources is central to this pre-test’s objectives. Sample targets, segmented by platform type, are designed to achieve the statistical precision needed to assess the acceptability of the instrument design and to defend any necessary design changes prior to the full study. The assumptions used to compute these targets were 1) an acceptable Type I error rate of 5%, 2) a Type II error rate of 20%, and 3) a detectable difference in proportional response characteristics of 10%. We will target a minimum of 292 completed surveys per platform.
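The 292-per-platform target is consistent with a standard two-sample test of proportions sized with Cohen's h as the effect size. The baseline proportion is not stated in this document; as one plausible reconstruction (an assumption, chosen because it reproduces the stated target), comparing proportions of 0.20 and 0.30 (a 10-percentage-point difference) at a 5% Type I error rate and 80% power yields approximately 292 completes per group:

```python
# Two-sample proportion sample-size check; the 0.20 baseline is an
# assumption chosen to illustrate how a ~292-per-group target arises.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

p1, p2 = 0.20, 0.30                  # 10-point detectable difference
h = proportion_effectsize(p2, p1)    # Cohen's h effect size (~0.232)
n_per_group = NormalIndPower().solve_power(
    effect_size=h, alpha=0.05, power=0.80, alternative="two-sided"
)
print(f"Required completes per platform: {n_per_group:.0f}")  # ~292
```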
Strategies for dealing with potential non-response bias:
The characteristics of the true population of virtual visitors to any NPS digital asset are currently unknown. As a result, controlling for non-response bias by comparing collected data to population targets will not be possible. However, features of the survey itself can help to mitigate potential sources of error. First, since the survey is of virtual visitors who are recruited online, there is no coverage error associated with the entirely online administration. Second, the audience for this survey is virtual visitors, ages 18 and older, who will be recruited while visiting an NPS digital resource, meaning that most of those contacted will qualify to participate. This may result in higher rates of participation and lower non-response bias than a study conducted from a completely random sample frame. Third, studies conducted by government agencies often have higher response rates than those conducted by commercial entities (Fan and Yan 2010), and previous studies that RSG has conducted with physical NPS visitor populations are associated with higher response rates than many other market surveys conducted by third-party organizations.
In addition, Dillman et al. (2014) have indicated that while higher response rates do not guarantee minimal nonresponse error, they do reduce the likelihood of nonresponse error and thus nonresponse bias. As a result, an effective strategy to reduce potential non-response bias is to maximize response rates from a representative random sample. Correspondingly, while we are assuming a conservative acceptance rate of 1% across all platforms, we will be implementing a number of common strategies that have been shown to be associated with improved response rates. For example, we will aim to post three recruitment messages on each form of social media; this number of contacts is associated with an increased response rate on web surveys, which can also help reduce non-response error (Cook et al. 2000, Dillman and Bowker 2000). Furthermore, response rates are likely to be heavily influenced by interest in the topic, or salience (Dillman and Bowker 2000); as we are contacting virtual visitors, we can assume that our potential respondents have at least some interest in the information being provided by the NPS on these platforms. Nonresponse on web surveys may also stem from mechanical issues, frustration with questionnaire design, or incompatibilities between the respondent’s device and the questionnaire design (Dillman and Bowker 2000, Vicente and Reis 2010, Atif et al. 2012); the extensive internal pre-testing of the survey instrument before the launch of the pre-test will help address the potential for these frustrations as well. Survey design choices, including the use of radio buttons, screen designs rather than scroll designs, and controlling for respondents’ perception of burden using a progress indicator, will also help minimize dropouts and increase response rates (Vicente and Reis 2010).
Click rates, drop-out rates, and response rates across all platforms will be monitored throughout the pre-test to inform refinements to recruitment and administration methods for the full implementation of the survey.
Description of any pre-testing and peer review of the methods and/or instrument:
The questionnaire was designed and reviewed by PhD- and MS-level professional research staff at RSG, in close coordination with NPS personnel in the Office of Communication and the Social Science Program.
Pretesting of the questionnaire will occur with four administrative staff (e.g., H.R. representatives, accountants) in RSG’s White River Junction and Burlington, VT, offices who have no experience with or knowledge of survey research or park management. The instrument will also be extensively tested by the professional survey development team that will design and program the survey for online administration. The survey will be tested to ensure that data retrieval, storage, question branching, and other survey features function as intended.
BURDEN ESTIMATES:
The combined total burden for this collection is estimated to be 280 hours (Table 4; a quick arithmetic check follows the table). We have estimated burden for the online questionnaire as follows:
Completed Surveys: 243 hours.
After the initial contact, it is anticipated that 1% of those contacted across all platforms will complete the survey, which will take approximately 10 minutes.
Partial non-responses: 37 hours.
It is anticipated that of all initial contacts, 438 will be partial non-respondents. These are respondents who accept the survey invitation but do not complete the questionnaire. It is estimated that partial non-respondents will spend about five minutes on the questionnaire before dropping out.
It is expected that the remaining virtual visitors will decline to participate; no burden is calculated for those individuals.
Table 4. Burden Estimates
Survey Type | Responses | Completion Time (minutes) | Burden Hours (rounded up)
Completed survey | 1,460 | 10 | 243
Partial non-response survey | 438 | 5 | 37
Total burden requested under this ICR | 1,898 | | 280
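As a quick check of the Table 4 arithmetic (raw hours shown; the table reports 243 and 37 hours, giving the 280-hour total):

```python
# Quick check of the Table 4 burden arithmetic.
completed_min = 1_460 * 10           # 14,600 minutes
partial_min = 438 * 5                #  2,190 minutes

completed_hr = completed_min / 60    # ~243.3 -> reported as 243 hours
partial_hr = partial_min / 60        #   36.5 -> reported as 37 hours
total_hr = 243 + 37                  # 280 hours requested

print(f"Completed: {completed_hr:.1f} h; Partial: {partial_hr:.1f} h; "
      f"Total requested: {total_hr} h")
```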
REPORTING PLAN:
The results of the pre-test will be presented in an internal report for park managers. Results of statistical analyses and summary statistics will be compiled (e.g., response frequencies, measures of central tendency, correlations, Chi-square, analysis of variance, factor analysis, and scale reliability analysis, as appropriate).
The information in the report will not be generalized or used beyond the scope of the pre-test. The pre-test will assess the suitability of the questionnaire and allow for any necessary refinement of the survey instrument and administration methods; the results are intended solely to inform the full implementation of the study surveying virtual visitors.
Final reporting will be delivered to park managers in hard copy and electronic formats and posted as a Natural Resource Data Series in the NPS Data Store (https://irma.nps.gov/DataStore/Reference/Profile/) as required by the NPS Programmatic Review Process.
LITERATURE CITED:
Atif, A., Richards, D., & Bilgin, A. (2012). Estimating non-response bias in a web-based survey of technology acceptance: A case study of unit guide information systems. In Proceedings of the 23rd Australasian Conference on Information Systems (ACIS 2012).
Cook, C., Heath, F., & Thompson, R. L. (2000). A meta-analysis of response rates in web- or internet-based surveys. Educational and Psychological Measurement, 60(6), 821–836.
Dillman, D. A., & Bowker, D. K. (2000). The web questionnaire challenge to survey methodologists. In Online Social Sciences (pp. 53–71).
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. Hoboken, NJ: John Wiley & Sons.
Fan, W., & Yan, Z. (2010). Factors affecting response rates of the web survey: A systematic review. Computers in Human Behavior, 26(2), 132–139. https://doi.org/10.1016/j.chb.2009.10.015
Vicente, P., & Reis, E. (2010). Using questionnaire design to fight nonresponse bias in web surveys. Social Science Computer Review, 28(2), 251–267. https://doi.org/10.1177/0894439309340751
NOTICES
Privacy Act Statement
General: This information is provided pursuant to Public Law 93-579 (Privacy Act of 1974), December 31, 1974, for individuals completing this form.
Authority: National Park Service Research mandate (54 USC 100702)
Purpose and Uses: This information will be used by the NPS Information Collections Coordinator to ensure appropriate documentation of information collections conducted in areas managed by, or sponsored by, the National Park Service.
Effects of Nondisclosure: Providing the information is mandatory in order to submit Information Collection Requests to the Programmatic Review Process.
Paperwork Reduction Act Statement
We are collecting this information subject to the Paperwork Reduction Act (44 U.S.C. 3501), as authorized by the National Park Service research mandate (54 USC 100702). This information will be used by the NPS Information Collections Coordinator to ensure appropriate documentation of information collections conducted in areas managed by, or sponsored by, the National Park Service. All parts of the form must be completed in order for your request to be considered. We may not conduct or sponsor, and you are not required to respond to, this or any other Federal agency-sponsored information collection unless it displays a currently valid OMB control number. OMB has reviewed and approved the National Park Service Programmatic Review Process and assigned OMB Control Number 1024-0224.
Estimated Burden Statement
Public reporting burden for this form is estimated to average 60 minutes per collection, including the time it takes to review instructions, gather information, and complete and review the form. This time does not include the editorial time required to finalize the submission. Comments regarding this burden estimate or any aspect of this form should be sent to the Information Collection Clearance Coordinator, National Park Service, 1201 Oakridge Dr., Fort Collins, CO 80525.
RECORDS RETENTION - PERMANENT. Transfer all permanent records to NARA 15 years after closure. (NPS Records Schedule, Resource Management And Lands (Item 1.A.2) (N1-79-08-1)).