
Supporting Statement B


National Park Service Visitor Survey Card


OMB Control Number 1024-0216



  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


Each year, approximately 130,000 visitors will be asked to participate in this study. Four hundred Visitor Survey Cards will be systematically distributed at each of the 330 National Park Service (NPS) units. The survey content and methodology, described below, are similar to those used in previous survey efforts since 1998 and were updated in 2014.


Respondent Universe

The respondent universe for this collection will be all adults 18 years of age or older visiting an NPS unit during designated sampling periods. Visitors will be randomly selected to participate in the study.


Visitor Selection

A random sample of visitors will be intercepted at the end of their visit at designated locations within the park. Sampling will typically occur during peak visitation periods. During the sampling periods, visitors will be asked to complete the Visitor Survey Card (VSC) on-site or they will be given the option to return the survey by mail.


All visitors refusing to participate in the study will be asked to respond to three non-response bias questions (see Item 2 below). These responses will be used to evaluate any non-response bias.


Expected Response Rate

Based on experience from previous collections, we expect to receive 71,500 responses annually (a 55% response rate). This expected response rate is based upon the results of the FY 2016 sample, which was 20% higher than in previous years. This sample size is considered robust enough to produce results with acceptable margins of error at both the individual park and national levels.


2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The distribution of the visitor survey cards will be evenly divided between weekends and weekdays, and between two blocks of time, 8:00 a.m. to noon and 1:00 p.m. to 5:00 p.m. Locations, days, dates, and times will be determined on a park-by-park basis. The protocols for the distribution of survey cards call for contacting visitors at a frequency of every nth person or vehicle. This protocol is to be followed with the following exception:


  • When vehicular or foot traffic is heavy, the surveys will be spread out over the entire four-hour period. Instead of sampling every nth person or vehicle, a time interval of three or five minutes between parties will be used (both selection rules are sketched below).
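The following is a minimal, illustrative sketch of the two intercept rules described above. The function names are hypothetical stand-ins; the actual value of n and the three- or five-minute spacing are set by each park's distribution protocol.

    import time

    def select_every_nth(counter, n):
        """Standard rule: intercept every nth person or vehicle passing the station."""
        return counter % n == 0

    def select_by_time_interval(last_contact_time, interval_minutes, now=None):
        """Heavy-traffic rule: intercept the next party only after the chosen
        interval (e.g., three or five minutes) has elapsed since the last contact."""
        now = time.time() if now is None else now
        return (now - last_contact_time) >= interval_minutes * 60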


All parks participating in the VSC will be assigned a survey month between February 1 and August 31. In most cases, this will be a month during the park’s peak visitation season. Each park unit will receive 400 surveys. With a response rate of at least 50%, we expect to receive an average of 200 returned surveys per park unit. The associated margin of sampling error is +/- 8% at the 95% confidence level for each park.


The corresponding margin of sampling error for the combined national dataset is less than +/- 1% at a 95% confidence level. This is desirable for reporting System-level performance with respect to national GPRA goals.
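As a rough check on the quoted margins, the sketch below applies the standard large-sample formula for a proportion at maximum variance (p = 0.5), using the sample sizes stated above. It is illustrative only and does not include any design-effect or finite-population adjustment, so the per-park figure quoted above is slightly more conservative than this simple calculation.

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """Half-width of a 95% confidence interval for a proportion based on n responses."""
        return z * math.sqrt(p * (1 - p) / n)

    print(f"per park (n = 200):    +/- {margin_of_error(200):.1%}")     # about 7%
    print(f"national (n = 71,500): +/- {margin_of_error(71_500):.2%}")  # about 0.37%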


The survey cards will be distributed at locations with high concentrations of visitors such as popular trail heads, visitor centers, and campgrounds. The survey coordinator in each park will randomly schedule a minimum of eight survey days (each having a four-hour time period) during the month. Sampling days are stratified so that four are weekdays and four are weekend days. In addition, equal numbers of mornings and afternoons will be included in the schedule. If an increase in sampling days is needed (based on individual circumstances such as weather, visitor attendance fluctuations, and construction), parks will be instructed on how to add days beyond the minimum of eight to ensure a representative sample of park visitors.
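A simplified sketch of that scheduling approach is given below, assuming a hypothetical helper that draws four weekday and four weekend dates at random and assigns four morning and four afternoon blocks. In practice, the park survey coordinator selects the dates and adjusts the schedule as described above.

    import calendar
    import random
    from datetime import date

    def schedule_survey_days(year, month, seed=None):
        """Draw four weekday and four weekend dates in the given month and assign
        equal numbers of morning (8:00 a.m.-noon) and afternoon (1:00-5:00 p.m.) blocks."""
        rng = random.Random(seed)
        days_in_month = calendar.monthrange(year, month)[1]
        days = [date(year, month, d) for d in range(1, days_in_month + 1)]
        weekdays = [d for d in days if d.weekday() < 5]   # Monday through Friday
        weekends = [d for d in days if d.weekday() >= 5]  # Saturday and Sunday
        chosen = rng.sample(weekdays, 4) + rng.sample(weekends, 4)
        blocks = ["8:00 a.m. to noon"] * 4 + ["1:00 p.m. to 5:00 p.m."] * 4
        rng.shuffle(blocks)
        return sorted(zip(chosen, blocks))

    for survey_day, block in schedule_survey_days(2017, 7, seed=1):
        print(survey_day.isoformat(), block)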


Initial visitor contact will be made by NPS employees or uniformed park volunteers in each of the chosen NPS units. This contact will take approximately one minute. The responses to the non-response questions will take an additional minute.


Hello, my name is [insert surveyor name]. The National Park Service is conducting a brief survey of our visitors. We would like to understand how satisfied you were with the services and facilities here at [insert Park name]. Your participation is voluntary, and this will take about three minutes of your time. You can complete the survey now, or you can return it later by mailing it to us. Would you be willing to help us by filling out this short survey?


If the visitor’s response is “Yes,” the surveyor will hand the survey card and a pencil to the visitor.


Thank you for agreeing to take a survey. Our preference is that you take the time to complete the card before you leave the park today and put it in this drop box. However, if you would like, you can mail it back to us by putting it in any US mailbox when you are done. We will receive it at the address printed on the back. Thank you and have a great day.


If the visitor’s response is “No,” the surveyor will continue by asking the visitor to complete the non-response survey:


I understand that you may not have time to complete and return the full version of the survey. However, would you be willing to answer three brief questions that will take only about one minute?


If the visitor’s response is “Yes,” the surveyor will ask the following three questions:


1. If you were to rate the overall quality of facilities, services, and recreational opportunities you experienced here today at (full park name), would you say they were very good, good, average, poor, or very poor?


      • Very Good

      • Good

      • Average

      • Poor

      • Very Poor

2. What is your zip code?


3. In what year were you born?


The surveyor will record the gender of the respondent on the log sheet.


Thank you and have a great day.


If the visitor’s response is “No” to both the survey and the non-response survey prompt, the surveyor will record the gender of the visitor on the log sheet and attempt to determine the reason for the refusal.


Would you mind if I ask why you are not able to participate today? [The surveyor will record the response.]

Thank you and have a great day.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


The following steps will be taken to maximize response rate and ensure an accurate and reliable sample at all collection sites.


Multiple options for completing the survey

Providing multiple locations (in addition to the drop boxes) to return the survey on-site has increased our response rate from 27% to 50%. Experience with the on-site completion option shows that the majority of respondents use the on-site drop boxes to return their surveys. Having knowledgeable, uniformed staff or volunteers wearing NPS insignia has also helped to increase the legitimacy of the survey.


Addressing potential non-response bias

We will use zip codes to calculate the distance traveled to the park to check for non-response bias. Participants’ gender and age will also be used to help determine any bias due to non-response. If a non-response bias is found, the data will be weighted to reduce its effect.
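A simplified sketch of that kind of weighting adjustment is given below. The group labels and counts are hypothetical; the actual adjustment classes (distance band, gender, age group) would come from the survey log sheets and the non-response questions.

    def nonresponse_weights(contacted_counts, respondent_counts):
        """Return a weight per group equal to the group's share of all contacted
        visitors divided by its share of respondents, so under-represented
        groups are weighted up."""
        total_contacted = sum(contacted_counts.values())
        total_respondents = sum(respondent_counts.values())
        weights = {}
        for group, contacted in contacted_counts.items():
            responded = respondent_counts.get(group, 0)
            if responded == 0:
                continue  # groups with no respondents need separate handling
            weights[group] = (contacted / total_contacted) / (responded / total_respondents)
        return weights

    # Hypothetical example: men and women were contacted equally often, but women
    # returned fewer surveys, so their responses receive a larger weight.
    print(nonresponse_weights({"men": 500, "women": 500}, {"men": 300, "women": 200}))
    # men ~= 0.83, women = 1.25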


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The questions in the current version of the VSC have been tested and approved annually by OMB since 1998. The script and process for the interviewer were tested for clarity and accuracy in recording responses using a random sample of fewer than nine people entering the Washington State Social & Economic Sciences Research Center. The same volunteers were also given the current version of the VSC and asked to review and comment on the overall structure, sequence, and clarity of the questions, and to estimate the time burden of the survey. The feedback did not provide suggestions for changes to the instrument or design.


The NPS Social Science Branch also elected to update the form by adding two questions from the currently approved NPS Pool of Known Questions (OMB Control Number 1024-0224).



5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Pacific Consulting Group

643 Bair Island Road, Suite 212

Redwood City, CA 94063


