
OMB Control Number: 1024-0224 Current Expiration Date: 5/31/2019


National Park Service

U.S. Department of the Interior



Do not revise any part of this form




Programmatic Clearance Process

for NPS-Sponsored Public Surveys









The scope of the Programmatic Review and Clearance Process for NPS-Sponsored Public Surveys is limited and will only include individual surveys of park visitors, potential park visitors, and residents of communities near parks. Use of the programmatic review will be limited to non-controversial surveys of park visitors, potential park visitors, and/or residents of communities near parks that are not likely to include topics of significant interest in the review process. Additionally, this process is limited to non-controversial information collections that do not attract attention to significant, sensitive, or political issues. Examples of significant, sensitive, or political issues include: seeking opinions regarding political figures; obtaining citizen feedback related to high-visibility or high-impact issues like the reintroduction of wolves in Yellowstone National Park, the delisting of specific Endangered Species, or drilling in the Arctic National Wildlife Refuge.



Submission Date:

4-11-2017


Project Title: Understanding Visitors to Brown v Board of Education National Historic Site


Abstract (not to exceed 150 words)

Social interactions and interpretation may affect the visitor experience at cultural and historic sites and may also affect the local economy. With increased use of parks, protected areas, and historical/cultural sites, understanding the social impacts of this increased use is critical to maintaining the value of Brown v Board of Education and parks like it. This study was designed to collect data about the levels, types, patterns, and impacts of the visitor experience at Brown v Board of Education in Kansas and thus help inform the ongoing and future planning process.


Principal Investigator Contact Information

Name:

Ryan Sharp

Title:

Assistant Professor – Park Management & Conservation

Affiliation:

Kansas State University

Address:

2021 Throckmorton Plant Sciences Center

Manhattan, KS 66506

Phone:

785-532-1665

Email:

[email protected]


Park or Program Liaison Contact Information

Name:

Sherda Williams

Title:

Deputy Superintendent

Park:

Brown v. Board of Education National Historic Site

Address:

1515 SE Monroe St.

Topeka, KS 66612

Phone:

785-354-4273 x231

Email:

[email protected]


Project Information


Where will the collection take place? (Name of NPS Site)

Brown v Board of Education (BRVB)




Sampling Period

Start Date: May 2017

End Date: October 2017




Type of Information Collection Instrument (Check ALL that Apply)


Mail-Back Questionnaire

Face-to-Face Interview

Focus Groups


On-Site Questionnaire

Telephone Survey



X Other (list) on-line Questionnaire


Will an electronic device be used to collect information?

No X Yes - type of device: Tablet and personal (at home) computers




Survey Justification:


Social science research in support of park planning and management is mandated in the NPS Management Policies 2006 (Section 8.11.1, “Social Science Studies”). The NPS pursues a policy that facilitates social science studies in support of the NPS mission to protect resources and enhance the enjoyment of present and future generations (National Park Service Act of 1916, 38 Stat 535, 16 USC 1, et seq.). NPS policy mandates that social science research will be used to provide an understanding of park visitors, the non-visiting public, gateway communities and regions, and human interactions with park resources. Such studies are needed to provide a scientific basis for park planning and development.

Public land management occurs in a complicated environment that bridges social and historical factors. While scientists and managers usually make decisions based on scientific evidence, visitors and stakeholders often respond to issues based on emotional attachments. Consequently, identifying visitors’ perceptions of and attitudes toward current issues, current uses, or potential management actions is critical to anticipating public response and reducing opportunities for conflict. Managers’ knowledge of visitors’ opinions provided by this research can directly inform the design of interpretation and public outreach in an intentional and prescriptive manner. Interpretation and outreach can then be used to influence beliefs, attitudes, and social norms. Addressing and influencing these elements through interpretation can ultimately alter stakeholders’ acceptance of management policies.

The overall purpose of this proposed collection is to gather baseline information that will help support visitor use management (VUM) planning at Brown v Board of Education (BRVB). This collection is intended to inform and guide NPS managers in providing sustainable and appropriate visitor experiences as part of an overall visitor use planning effort at the historic site. This study has several objectives:

1) Evaluate visitor perceptions of the frequency, type, density, and temporal and spatial distributions of visitor use at BRVB within and across seasons;

2) Determine the relationships between use patterns and historic and social conditions based on visitor perceptions; and

3) Understand the economic impact of visitors to BRVB, as well as BRVB visitors on Topeka and the surrounding community.

Study findings will be used to:

  • Provide baseline data for understanding current conditions and for comparison to future monitoring efforts;

  • Update existing plans where appropriate [e.g., General Management Plan, Development Concept Plan, Interpretation and Visitor Experience Plan]; and

  • Inform potential management actions aimed at maintaining high-quality visitor experiences and protecting cultural resources. Potential actions informed by this project may include, but are not limited to, redistributing use, infrastructure improvements, interpretation design and delivery, and public communication.

BRVB managers requested this information collection to inform a larger planning process. Data will be used to accurately gauge visitor experiences and economic impacts in Topeka and the surrounding area. If the site is found to have an economic impact, this information will help the park work with community members to plan a more complete visitor experience in the city. BRVB managers are interested in reliable visitor statistics and feedback to address operations and planning issues in the ongoing planning process, given park managers’ input and suspected social issues and economic impacts.

To this end, an online questionnaire will be used to collect the following information from visitors contacted at the entryway intercept location at the site:

  • Individual characteristics

  • Trip/visit characteristics

  • Individuals’ perceptions of park’s management actions

  • Users’ perceptions of historic events

  • Users’ perceptions of social issues


Survey Methodology

(a) Respondent Universe:

The respondent universe for this collection will be visitors, age 18 and older, visiting the historic site during the study period (May 2017 - October 2017); respondents will be selected through a systematic sample. According to NPS visitor use statistics, approximately 25,000 people visited the park in 2015.


(b) Sampling Plan/Procedures:

Sampling will occur at the entryway intercept site as visitors exit the historical site, from 9 am to 5 pm, on the following days: May 25-29, June 19-23, July 1-5, August 22-26, September 1-5, and October 24-28, for a total of 30 sampling days. On each sampling day, a trained surveyor will be stationed at the entrance of the site (i.e., the foyer). Table 1 provides the estimated number of visitors expected, approached, and expected to agree to participate in the online survey, based on BRVB 2016 visitation data.


Table 1. Expected Monthly Participation



Month     | Estimated Park Visitation | Average Visitors per Week | Targeted Visitors Approached per Sampling Period | Expected On-site Acceptance Rate | Expected Number Agreeing to Participate
May       | 3,256                     | 344                       | 160                                              | 75%                              | 120
June      | 2,631                     | 526                       | 250                                              | 75%                              | 187
July      | 2,735                     | 547                       | 250                                              | 75%                              | 187
August    | 2,643                     | 527                       | 250                                              | 75%                              | 187
September | 1,688                     | 338                       | 160                                              | 75%                              | 120
October   | 1,722                     | 344                       | 160                                              | 75%                              | 120
Total     | 14,675                    | 2,626                     | 1,230                                            |                                  | 921



Although the number of people contacted will vary by day of the week, weather, and special events during the sampling period, we plan to collect approximately 921 names and email addresses of visitors who agree to complete the online survey over the course of the sampling period.
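A minimal sketch of the Table 1 arithmetic, provided for illustration only: it recomputes the expected number of participants from the targeted-approach counts and the 75% on-site acceptance rate stated above (truncating to whole visitors is an assumption).

```python
# Illustrative only; not part of the approved collection instrument.
ACCEPTANCE_RATE = 0.75  # expected on-site acceptance rate (Table 1)

# month: targeted number of visitors approached per sampling period (Table 1)
approached_by_month = {
    "May": 160, "June": 250, "July": 250,
    "August": 250, "September": 160, "October": 160,
}

total_approached = 0
total_agreeing = 0
for month, approached in approached_by_month.items():
    agreeing = int(approached * ACCEPTANCE_RATE)  # truncate to whole visitors
    total_approached += approached
    total_agreeing += agreeing
    print(f"{month:<10} approached={approached:>4}  expected to agree={agreeing:>4}")

print(f"Total approached: {total_approached}")       # 1,230
print(f"Total expected to agree: {total_agreeing}")  # 921
```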



(c) Instrument Administration:


On-site Contact

The initial contact with all visitors will be used to explain the study and should take approximately one minute. At this point, all individuals approached will be asked the non-response bias questions to collect information that will be used in the final analysis (see item (e) below). If visitors are interested in participating (see attached script), the survey interviewer will ask the person serving as the respondent for the study (the individual within the group who has the next birthday) to provide their contact information (name and email address). Visitors who refuse will be thanked for their time and willingness to answer the non-response bias questions. The number of refusals will be recorded and used to calculate the overall response rate for the collection.


Visitors selected for participating in the survey will be read the following script:


Hello, my name is _________. I am conducting a survey for the National Park Service to understand more about your experiences in the park today. The answers you provide based on your perception of the park will help inform future management actions. Would you be willing to answer a few questions? (ask non-response questions) Your participation is voluntary and all responses will be kept anonymous.


If NO – the interviewer will thank them for their time and end the contact.

If YES – then ask, “have you (or – has any member of your group) been asked to participate in this survey before?”

  • If “YES” (already asked to participate) then, “Thank you for agreeing to participate in this study. Have a great day.”

  • If “NO” (have not been previously asked to participate) then, “Thank you for agreeing to participate. Are you at least 18 years old (or – who in your group is at least 18 years old and has the next birthday)?” Ask them to start the process by answering the non-response bias questions (listed below) and record their responses in the spaces provided on the tablet computer. Then ask, “Would you please provide your name and email address? You will receive an email with the full survey in about one week. Once you begin, it will take no more than 15 minutes to complete. All of your responses will be completely anonymous.”


On-line Survey

The online survey will be administered one week after the end of each data collection period and sent to the email addresses provided by visitors on-site. Qualtrics (an online survey platform) will be used to manage the email distribution of the surveys, and a standard Dillman method of online administration will be used. We will send an initial email to the visitor with an explanation of the survey and a reminder that they provided their email address a week earlier during their visit to BRVB. The email will also contain a link to the online, anonymous, and confidential survey. Once the link is clicked, the visitor will be taken to the online survey, which will take less than 15 minutes to complete. Two additional follow-up emails will be sent to visitors who do not complete the online survey in response to the original email; only those who have not completed the survey will receive a reminder. The first reminder email will be sent two weeks after the original email, and the second reminder email will be sent two weeks after the first reminder. This method will be used to ensure the best possible response rate. If the visitor declines to complete the online survey at any point in the reminder sequence, they will be asked to answer a small subset of questions from the full survey to test for online non-response bias (see item (e) below).
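A minimal sketch of the mailing schedule described above, assuming send dates are computed relative to the end of each data collection period; the function name and the use of Python's standard library (rather than Qualtrics' own scheduling tools) are illustrative assumptions.

```python
# Illustrative sketch of the Dillman-style email schedule described above;
# the actual distribution will be managed in Qualtrics.
from datetime import date, timedelta

def email_schedule(collection_period_end: date) -> dict:
    """Planned send dates: initial invitation one week after the collection
    period ends, then two reminders at two-week intervals (reminders go only
    to visitors who have not yet completed the survey)."""
    initial = collection_period_end + timedelta(weeks=1)
    return {
        "initial_invitation": initial,
        "first_reminder": initial + timedelta(weeks=2),
        "second_reminder": initial + timedelta(weeks=4),
    }

# Example: the May sampling period ends May 29, 2017.
print(email_schedule(date(2017, 5, 29)))
```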


(d) Expected Response Rate/Confidence Levels:

The expected response rate for this collection is based on surveys at similar park sites using the same methodology. Visitors contacted through a mixed method of on-site intercept followed by email are more likely to complete and return the online survey. We obtained a similar response rate in a comparable study at Great Sand Dunes National Park (GRSA); although visitors to GRSA may not be exactly the same as BRVB visitors, the demographics at BRVB appear to be similar to other NPS units (based on manager feedback).


The expected response rate for the online survey is based upon the anticipated total number of on-site visitors contacted (n = 1,230). We expect that 75% (n = 921) will agree to participate in the online survey by providing their contact information during the initial contact. Of all of the visitors contacted, 25% (n = 309) will refuse to participate; however, of those refusals, at least 20% (n = 62) will answer the non-response survey questions (Table 2).


Table 2. Response Rate of On-site Acceptance and Refusal


(Response rates are based upon the total estimated number of all visitors contacted.)

      | Onsite Initial Contacts | Acceptance (75%) | Refusals (25%) | Non-response Survey (20% of refusals) | Hard Refusals (80% of refusals)
Total | 1,230                   | 921              | 309            | 62                                    | 247



Based upon the number of people accepting the invitation to complete the survey (n = 921), we anticipate that at least 50% (n = 460) will complete the online survey; this figure includes people responding to the follow-up reminders. Of the non-respondents (n = 461), we expect that 40% (n = 184) will complete the online non-response survey (Table 3).


Table 3. Response Rate for Completed On-line Surveys.


      | Online Surveys Sent | Completed Online Surveys (50%) | All Refusals | Non-response Survey (40% of refusals) | Non-respondents / Hard Refusals (60% of refusals)
Total | 921                 | 460                            | 461          | 184                                   | 277
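For illustration, a short sketch of the response-rate chain behind Tables 2 and 3, starting from the Table 1 totals and the rates stated above; the rounding convention is an assumption.

```python
# Illustrative only: reproduces the expected response chain in Tables 2 and 3.
contacts = 1230   # on-site initial contacts (Table 1)
agreed = 921      # expected to provide contact information (about 75%)

refusals = contacts - agreed               # on-site refusals
nr_onsite = round(refusals * 0.20)         # refusals answering the four non-response questions
hard_refusals = refusals - nr_onsite

completed = round(agreed * 0.50)           # completed online surveys
non_respondents = agreed - completed
nr_online = round(non_respondents * 0.40)  # non-respondents answering the non-response items

print(refusals, nr_onsite, hard_refusals)     # 309, 62, 247 (Table 2)
print(completed, non_respondents, nr_online)  # 460, 461, 184 (Table 3)
```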


Confidence intervals provide a method for estimating population values from sample statistics. Based on the survey sample sizes, there will be 95 percent confidence that the survey findings will be accurate to within 5 percentage points. The proposed sample size will be adequate for bivariate comparisons and more sophisticated multivariate analyses. For dichotomous response variables, estimates will be accurate within this margin of error; confidence intervals will be somewhat larger for questions with more than two response categories.
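A minimal check of the stated precision, assuming simple random sampling and the most conservative case for a dichotomous item (p = 0.5); the function name is illustrative.

```python
# Illustrative check of the "within 5 percentage points" claim above.
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Half-width of the 95% confidence interval for a sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# Roughly 460 completed online surveys are expected (Table 3).
print(f"+/- {margin_of_error(460):.1%}")  # about +/- 4.6 percentage points
```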


(e) Strategies for dealing with potential non-response bias:


On-site

During the initial contact, the interviewer will ask each visitor four questions taken from the survey. These questions will be used in a non-response bias analysis.


1) What type of group are you traveling with today?

2) How many people are in your group?

3) How did this visit fit into your travel plans? (i.e., primary destination, one of several destinations, or not a planned destination).

4) How old is the person who will complete the questionnaire?


Responses will be recorded on a log for every survey contact. Results of the non-response bias check will be described in a report and the implications for park planning and management will be discussed.




On-line

The online survey will include an option not to continue with the survey; visitors who choose this option will be asked to answer the same four questions listed above, just as visitors who refuse on-site are asked. These questions will be used in a non-response bias analysis.


(f) Description of any pre-testing and peer review of the methods and/or instrument (recommended):


The questionnaire format and many of the questions have been used in survey instruments previously approved by OMB. The questions are taken from the currently approved NPS Pool of Known Questions (OMB 1024-0224; current expiration date: 5/31/2019). Variations of the questions have been reviewed by BRVB managers and university professors. The questionnaire was tested on eight voluntary members of the general public for burden length and clarity of the questions. Feedback from the volunteers was incorporated into the final questionnaire.



Burden Estimates

Overall, we plan to approach a total of 1,230 individuals during the sampling periods. We expect that the initial on-site contact with each individual approached will take at least one minute per person (1,230 x 1 minute = 20.5 hours). We anticipate that 921 individuals will agree to complete the online survey during the initial on-site contact. All visitors who decline the invitation (n = 309) will be asked whether they would be willing to answer the four questions that serve as the non-response bias check. We expect that 20% (n = 62) of the on-site refusals will answer the non-response bias questions, leaving 247 visitors who refuse any part of the invitation to participate; for those individuals, we will record the reason for refusal.

Table 4. Estimated Respondent Burden for BRVB On-Site Contact

Activity                                            | Estimated Number of Respondents | Estimated Time (minutes) | Estimated Burden (hours)
Initial contact                                     | 1,230                           | 1                        | 21
Non-response bias survey                            | 62                              | 1                        | 1
On-site acceptance (provide name and email address) | 921                             | 1                        | 15
Total hours                                         |                                 |                          | 37


For those who agree to participate (n = 921), we expect that 50% (n = 460) will submit a completed version of the online survey. An estimated 15 minutes will be required to read the instructions and complete and submit the questionnaire (460 responses x 15 minutes = 115 hours). Of the 461 non-respondents, we estimate that 20% (n = 92) will take one minute to complete the non-response survey (92 x 1 minute = 1.5 hours). The total burden for the online survey is estimated to be 117 hours.


Table 5. Estimated Respondent Burden for BRVB Online Survey

Activity                 | Estimated Number of Respondents | Estimated Time (minutes) | Estimated Burden (hours)
Completed online surveys | 460                             | 15                       | 115
Non-response bias survey | 92                              | 1                        | 2
Total                    | 552                             |                          | 117


The total estimated respondent burden for this collection (initial contact + acceptance + completed surveys + non-response bias) is 154 hours.


Total for all Responses                                   | Estimated Burden (hours)
On-site burden (initial contact and non-response survey)  | 37
Completed online surveys                                  | 117
Total                                                     | 154
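As a cross-check of the burden arithmetic in Tables 4 and 5, the sketch below recomputes each activity's hours from the respondent counts and per-response minutes, rounding half up to whole hours as the tables above do; the activity labels are paraphrased.

```python
# Illustrative cross-check of the burden-hour totals in Tables 4 and 5.
import math

activities = {
    # activity: (respondents, minutes per response)
    "initial on-site contact": (1230, 1),
    "on-site non-response bias survey": (62, 1),
    "provide name and email address": (921, 1),
    "completed online survey": (460, 15),
    "online non-response bias survey": (92, 1),
}

total_hours = 0
for name, (n, minutes) in activities.items():
    hours = math.floor(n * minutes / 60 + 0.5)  # round half up, as in the tables
    total_hours += hours
    print(f"{name:<35} {hours:>4} hours")

print(f"{'Total':<35} {total_hours:>4} hours")  # 154 hours
```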



Reporting Plan


The study results will be presented in internal agency reports for NPS managers at the park. Response frequencies will be tabulated and measures of central tendency computed (e.g., mean, median, mode, as appropriate). The reports will be archived with the NPS Social Science Program for inclusion in the Social Science Studies Collection, as required by the NPS Programmatic Approval Process. Hard copies will be available upon request.
