1024-0224 Programmatic Form - SEM 1.0



National Park Service

U.S. Department of the Interior


Social Science Program





OMB Control Number 1024-0224

Current Expiration Date: 8-31-2014

Programmatic Approval for NPS-Sponsored Public Surveys



Submission Date

4-2-2015


1.

Project Title:

NPS Socioeconomic Monitoring Program (SEM): Phase 1
















2.

Abstract:

The purpose of this project is to conduct 20 surveys to test the questions and procedures that will be used to develop a final survey for the service-wide NPS socioeconomic monitoring program (SEM).



(not to exceed 150 words)

3.

Principal Investigator Contact Information

First Name:

Steve

Last Name:

Lawson

Title:

Director, Public Lands Planning and Management

Affiliation:

Resource Systems Group, Inc.

Street Address:

55 Railroad Row

City:

White River Junction

State:

VT

Zip code:

05001

Phone:

802-295-4999

Fax:

802-295-1006

Email:

[email protected]


4.

Park or Program Liaison Contact Information

First Name:

Bret

Last Name:

Meldrum

Title:

Branch Chief

Park:

NPS Social Science Branch

Street Address:

1201 Oakridge Drive

City:

Fort Collins

State:

CO

Zip code:

80525

Phone:

970-267-7213

Fax:


Email:

[email protected]



Project Information


5.

Park(s) For Which Research is to be Conducted:

This collection is designed to be administered at 20 NPS units (to be determined) that will be systematically selected to cover each of the NPS regions and a range of NPS unit types (e.g., national parks, historical parks, recreation areas).




6.

Survey Dates:

May 1, 2015

December 31, 2016





7.

Type of Information Collection Instrument (Check ALL that Apply)



Mail-Back Questionnaire

On-Site Questionnaire

Face-to-Face Interview

Telephone Survey

Focus Groups



Other (explain)




8.

Survey Justification:

(Use as much space as needed; if necessary, include additional explanation on a separate page.)

Social science research in support of park planning and management is mandated in the NPS Management Policies 2006 (Section 8.11.1, “Social Science Studies”). The NPS pursues a policy that facilitates social science studies in support of the NPS mission to protect resources and enhance the enjoyment of present and future generations (National Park Service Act of 1916, 38 Stat 535, 16 USC 1, et seq.). NPS policy mandates that social science research will be used to provide an understanding of park visitors, the non-visiting public, gateway communities and regions, and human interactions with park resources. Such studies are needed to provide a scientific basis for park planning, management, and policy.


The National Park Service is well known for its survey research in national parks throughout the nation. Most of these surveys are conducted on a park-by-park basis. The socioeconomic monitoring program (SEM) was developed out of concern that data collected in this way could not be systematically rolled up into a long-term assessment of national trends in the characteristics and behaviors of human populations in national parks.


The justification for a service-wide socioeconomic monitoring program is identified in the NPS strategic goals for science, in an external review of the NPS Social Science Program, and in the 2008 Department of the Interior Appropriations Bill Joint Explanatory Statement.


For the purposes of this request, we would like to gather similar information using a single instrument at approximately 20 NPS units. This information will include: trip characteristics, spending in gateway communities, perceptions of park experiences, attitudes toward park management, and satisfaction with park services and facilities. The information collected will be used to develop a final survey and sampling methods that the socioeconomic monitoring program will use to administer surveys at selected parks. The purpose of this request is to test the information collection process at 20 parks in order to validate the following:

  • Survey content and format

  • Sampling methods and design, and

  • Sampling frames



9.

Survey Methodology: (Use as much space as needed; if necessary, include additional explanation on a separate page.)

  a. Respondent Universe:

The respondent universe for the surveys in this submission will be all recreational visitors, age 18 and older, visiting the selected NPS units during the sampling period.


  b. Sampling Plan/Procedures:

A systematic sampling procedure will be used to intercept every kth visitor group and invite them to participate in the study. The sampling interval used at each NPS unit will be based on the unit’s 2013 Visitor Use Statistics (http://www.nature.nps.gov/assets/redirects/statsRedirect.cfm).
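
To illustrate how a sampling interval might be derived from those visitation statistics, the following is a minimal Python sketch; the function name, the assumed share of annual visitation that falls within the sampling window, and the average group size are hypothetical and are not specified in this submission.

    # Illustrative sketch only: deriving the systematic sampling interval k for one
    # NPS unit. All figures below are hypothetical placeholders.
    def sampling_interval(annual_visits_2013, share_during_sampling, avg_group_size, target_contacts):
        """Return k so that intercepting every kth visitor group yields roughly target_contacts."""
        expected_visitors = annual_visits_2013 * share_during_sampling
        expected_groups = expected_visitors / avg_group_size
        return max(1, round(expected_groups / target_contacts))

    # Hypothetical unit: 1.2 million visits in 2013, 5% occurring during the sampling
    # window, an average group size of 3, and a target of 390 on-site contacts.
    print(sampling_interval(1_200_000, 0.05, 3.0, 390))  # -> 51, i.e., every 51st group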


To prepare a sampling plan, NPS units were organized into one of three categories, based on previous research about NPS unit types (Rookey, Le, Littlejohn & Dillman, 2012). The categories include: 1) National Parks, which are large, natural resource-based parks; 2) Recreational Parks, which include National Recreation Areas, Seashores, Lakeshores and other units with a primary focus on recreation-related activities; and 3) Historical Parks, which have a primary focus on history and culture. Based on this categorization of NPS units, our sampling plan includes the following:


Site Location                           Number of sites    Total # of contacts per site
National Parks                          5                  390
Recreational and Historical Parks       15                 438


The survey design and sampling plan for the collections are based on Dillman’s (2010) Tailored Design Method (TDM). The methodology has been used in more than 250 surveys conducted by the NPS Visitor Services Project (VSP). All of the questions in this collection, including any noted variations, appear in the current NPS Pool of Known Questions (OMB 1024-0224; Current Expiration Date: 8-31-2014).


  c. Instrument Administration:


Each interviewer will be instructed to contact every kth visitor group at the designated intercept locations at each of the NPS units. Each interviewer will be trained on every aspect of on-site surveying, including using sampling intervals, avoiding sampling bias, and handling all types of interviewing situations, especially the safety of the visitor and the interviewer. Quality control will be ensured by monitoring interviewers in the field and by checking their paperwork at the end of each survey day.


The initial contact with visitors will be used to explain the study and determine whether visitors are interested in participating. This should take approximately 1 minute per selected group. When a group is encountered, the survey interviewer will approach an adult in the group to request participation. All contacted visitors, including those who refuse to participate in the survey upon this initial contact, will then be asked to respond to a set of non-response bias questions (listed below, in item 9e). The interviewers will record observable information (i.e., current time, group size) on the survey log and non-response bias form, whether or not visitors agree to participate or even to answer the non-response bias questions. Visitors who decline to participate in the study will be thanked for their consideration. The number of refusals will be recorded and used to calculate the response rate for the collection at each park unit and overall.


Groups that agree to participate in the survey will be asked to identify the adult in the group who will have the next birthday to serve as the respondent. That individual will then be given a mail-back survey packet and asked to provide or personally record his or her name, address, phone number, and email address on the survey tracking sheet; this information will only be used to follow up with non-respondents who agree to participate but do not promptly return the completed questionnaire. At the end of the survey sampling period, all visitors accepting a survey packet on-site will be mailed a thank-you/reminder postcard within 11 working days. A reminder letter, replacement questionnaire, and postage-paid return envelope will be sent to non-respondents 21 working days after completion of on-site contacts.


Visitors who are contacted for the information collection in each NPS unit will be read the following script:


“Hello, my name is _________. I am conducting a survey for the National Park Service to help managers and planners understand your opinions about the programs and services here. Your participation is voluntary and all responses will be kept anonymous. Would you be willing to take a questionnaire and mail it back to us using the postage-paid envelope?”


If YES – then ask, “Has any member of your group already participated in this survey?”

If “YES” (previously agreed to participate) then, “Thank you for agreeing to participate in this study; we hope that you will return the questionnaire soon, if you have not already. Have a great day.”

If “NO” (have not previously agreed to participate) then,

“Thank you for agreeing to participate. Most of our questions are in the mail-back survey, but I do have a few questions I need to ask you now.” [The surveyor will ask them to start the process by answering the non-response bias questions (listed in item 9e). The responses will be recorded in the spaces provided on the survey log and non-response bias form. The surveyor will then hand them a survey packet including the questionnaire and a self-addressed stamped envelope.]

If NO – (soft refusal) then, “That’s fine; we won’t bother you with the mail-back survey. But would you be willing to take just one minute and answer a couple of questions for me now, to help us be sure our sample is reliable?” [The surveyor will record responses in the spaces provided on the tracking sheet and then thank them for their time.]

If NO – (hard refusal) then, “Thank you for your time. Have a great day.”

  d. Expected Response Rate/Confidence Levels:

Although the same methodology and survey procedures will be used at each of the 20 NPS units, the response rates are expected to vary by unit type (Rookey, Le, Littlejohn & Dillman, 2012). In particular, response rates in national parks are expected to be higher than those in recreational or historical parks; thus, expected response rates and confidence levels are described here by park type for the 20 NPS units.


National Parks: A total of 390 visitors will be contacted at each park during the sampling period. Based on similar studies conducted by the NPS VSP, it is expected that 350 (90%) visitors will agree to participate in the survey and that 228 visitors (65%) will complete and return the survey by mail. The target sample size for each national park is the same; park visitation data will be used to weight the resultant survey data proportional to park visitation.


Recreational and Historical Parks: These surveys will be administered at a total of 15 recreational and historical parks, and at each park, a total of 438 visitors will be contacted during the sampling period. Based on similar studies conducted by the NPS VSP, it is expected that 350 (80%) visitors will agree to participate in the survey and that 228 visitors (65%) will complete and return the survey by mail. The target sample size for each recreational and historical park is the same; park visitation data will be used to weight the resultant survey data proportional to park visitation.
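
As a quick check, the per-park expectations above follow directly from multiplying the contact totals by the anticipated cooperation and mail-back rates; a minimal sketch (the function and variable names are illustrative only):

    # Illustrative check of the expected completed questionnaires per park, using the
    # per-park contact totals and the cooperation/return rates stated above.
    def expected_completes(contacts, agree_rate, return_rate):
        """Visitors contacted on-site -> expected completed mail-back questionnaires."""
        return round(contacts * agree_rate * return_rate)

    print(expected_completes(390, 0.90, 0.65))  # National Parks: ~228 per park
    print(expected_completes(438, 0.80, 0.65))  # Recreational/Historical Parks: ~228 per park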


Based on the survey sample sizes, there will be 95% confidence that the survey findings will be accurate to within 7 percentage points, regardless of park type (Fowler, 1993). Thus, the proposed sample sizes will be adequate for bivariate comparisons and will allow for comparisons between study sites as well as more sophisticated multivariate analysis. For dichotomous response variables, estimates will be accurate within the stated margin of error; confidence intervals will be somewhat larger for questions with more than two response categories.
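
The 7-percentage-point figure is consistent with the standard margin-of-error calculation for a proportion at the worst case p = 0.5; the following minimal sketch uses the expected per-park sample of about 228 completed questionnaires and assumes simple random sampling for illustration.

    # Illustrative margin-of-error check for a dichotomous item at 95% confidence,
    # assuming simple random sampling and the worst case p = 0.5.
    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """Half-width of the 95% confidence interval for a proportion, in percentage points."""
        return z * math.sqrt(p * (1 - p) / n) * 100

    print(round(margin_of_error(228), 1))  # ~6.5 points per park, within the stated 7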





  e. Strategies for dealing with potential non-response bias:

During the initial contact, the interviewer will ask each visitor four questions taken from the survey. These questions will be used in a non-response bias analysis.


  1. Overall, how would you rate the quality of the facilities, services, and recreational opportunities in [NPS site]?

Very good Good Average Poor Very Poor

  2. On this trip to [NPS site], what type of group were you with?

Alone Friends Family Family and friends

Other (Please specify) ____________________________

  3. On this trip, how much time did you spend visiting [NPS site]?

_____ Number of hours, if a day trip

_____ Number of days, if greater than 1 day

  4. Did anyone in your group have a physical condition that made it difficult to access or participate in park activities or services, on this trip to [NPS site]?

Yes No


All responses will be recorded for every survey contact, except for those refusing to participate in the study or to answer the non-response bias questions. Results of the non-response bias check will be reported, and any implications for generalizing the survey results to the study population will be discussed.
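
One common way to carry out such a check, shown here only as a sketch, is to compare mail-back respondents with on-site refusals on each of the four questions; the file name, column names, and choice of a chi-square test are assumptions for illustration, not a prescribed analysis.

    # Illustrative non-response bias check: compare mail-back respondents with
    # on-site refusals on one of the four questions above. The input file, column
    # names, and choice of test are hypothetical.
    import pandas as pd
    from scipy.stats import chi2_contingency

    contacts = pd.read_csv("survey_log.csv")             # hypothetical log of on-site contacts
    table = pd.crosstab(contacts["returned_mailback"],   # True/False
                        contacts["quality_rating"])      # Very good ... Very poor

    chi2, p_value, dof, _ = chi2_contingency(table)
    print(f"chi2={chi2:.2f}, p={p_value:.3f}")  # p >= 0.05 suggests no detectable bias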


  f. Description of any pre-testing and peer review of the methods and/or instrument (recommended):

The survey questions are taken from the currently approved list of questions in the NPS Pool of Known Questions (OMB 1024-0224; Current Expiration Date: 8-31-2014). The questionnaire format and many of the questions are similar to those used in more than 250 previous NPS VSP survey instruments. Variations of the questions have been reviewed by NPS managers, professors at Colorado State University and the University of Wyoming, and PhD-level NPS survey research consultants at RSG.


In addition, five volunteers were asked to complete the survey instrument and answer a series of debriefing questions. We also asked for feedback on the respondent burden, the quality and clarity of the survey instrument and instructions, and ways to minimize respondent burden. Participants were also asked to indicate whether they had any difficulty or confusion with multi-item response scales and/or instructions for recording responses (e.g., “Mark one” or “Mark all that apply”).


The feedback from the pre-test participants was unanimously positive; participants reported that the layout of the survey instrument and the question wording were straightforward, which helped to minimize respondent burden. Participants reported that they had no trouble with skip patterns, multi-item response scales, or other instructions for recording responses. The pre-test administrator also inspected the completed survey instruments after the pre-test concluded and found no systematic response errors or item non-response patterns.


The time it took each pre-test participant to complete the survey instrument averaged 19 minutes (Table 1). This finding helps to validate the burden hour estimates and suggests that participation in the study does not cause undue/excessive respondent burden.


Table 1. Pre-test participants’ completion times.

Respondent    Completion Time
#1            13 minutes
#2            16 minutes
#3            20 minutes
#4            21 minutes
#5            23 minutes
Average       19 minutes


Participants in the pre-test offered the following minor suggestions to improve the wording or format of specific questions in the survey instrument:

  1. Use more specific wording when asking the number of people in their vehicle, similar to how group size is asked (i.e., “including yourself”).

  2. Add a “Don’t know/not sure” response to spending-related questions for those respondents who are unsure of how much they spent on specific items.

  3. Add a “Less than high school” option to the question about the amount of education completed by group members, specifically for young children within the group.


The survey instrument was revised to address each of the above comments and three spending-related questions were also added.


10.

Burden Estimates:

National Parks:

  • A total of 390 visitors will be contacted at each site during the sampling period (390 visitors x 15 sites = 5,850 contacts).

  • It is expected that 1,509 (18%) visitors will completely refuse to participate. When possible, we will use two minutes to record non-response information (1,509 visitors x 2 minutes = 50 hours).

  • The initial contact time for visitors agreeing to participate (n=4,341) is expected to be one minute, with an additional two minutes needed to ask the four non-response bias questions (4,341 participants x 3 minutes = 217 hours).

  • Based on similar studies conducted by the NPS VSP, it is expected that 65% (n=2,821) will spend the 20 minutes to complete and return the survey by mail (2,821 mail-back surveys x 20 minutes = 940 hours).

  • The total burden for the national sample is estimated to be 1,207 hours.


Recreational and Historical Parks:

  • A total of 438 visitors will be contacted at each site during the sampling period (438 visitors x 5 sites = 2,190 contacts).

  • It is expected that 438 (20%) visitors will completely refuse to participate. When possible, we will use two minutes to record non-response information (438 visitors x 2 minutes = 15 hours).

  • The initial contact time for visitors agreeing to participate (n=1,752) is expected to be one minute, with an additional two minutes needed to ask the four non-response bias questions and provide instructions (1,752 participants x 3 minutes = 88 hours).

  • Based on similar studies conducted by the NPS VSP, it is expected that 65% (n=1,139) will spend the 20 minutes to complete and return the survey by mail (1,139 mail-back surveys x 20 minutes = 380 hours).

  • The total burden for the recreational and historical parks is estimated to be 483 hours.


The total respondent burden for this collection is estimated to be 1,690 hours.
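
The totals above reduce to contacts multiplied by minutes per contact, divided by 60; the following minimal sketch simply reproduces that arithmetic from the figures stated above and in the summary table below (the row labels are illustrative only).

    # Illustrative recomputation of the respondent-burden totals from the figures above.
    burden_rows = [
        # (label, number of contacts, minutes per contact)
        ("National Parks - initial contacts",     4_341, 3),
        ("National Parks - on-site refusals",     1_509, 2),
        ("National Parks - completed mail-backs", 2_821, 20),
        ("Other Parks - initial contacts",        1_752, 3),
        ("Other Parks - on-site refusals",          438, 2),
        ("Other Parks - completed mail-backs",    1_139, 20),
    ]

    total_hours = 0
    for label, n, minutes in burden_rows:
        hours = round(n * minutes / 60)
        total_hours += hours
        print(f"{label}: {hours} hours")

    print(f"Total respondent burden: {total_hours} hours")  # 1,690 hours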



                                        Estimated Number    Estimated Time to    Estimated Respondent
                                        of Contacts         Complete (mins)      Burden (hrs)

Initial Contacts
    National Parks                      4,341               3                    217
    Other Parks                         1,752               3                    88

On-site Refusals
    National Parks                      1,509               2                    50
    Other Parks                         438                 2                    15

Completed Responses
    National Parks                      2,821               20                   940
    Other Parks                         1,139               20                   380

Total Respondent Burden                                                          1,690





11.

Reporting Plan:

The study results will be presented in an internal agency report for the NPS Social Science Program and park managers, and at two meetings with NPS Social Science Program staff and other decision-makers. Response frequencies will be tabulated and measures of central tendency computed (e.g., mean, median, mode, as appropriate). The report will be archived with the NPS Social Science Program for inclusion in the Social Science Studies Collection, as required by the NPS Programmatic Approval Process, and will also be posted on the Park Studies Unit VSP website at: http://psu.uidaho.edu/vsp.reports.htm. Hard copies will be available upon request.


References:

Dillman, D. A., Smyth, J. D., & Christian, L.M. (2010). Internet, Mail, and Mixed-mode surveys: The tailored design method, 3rd Edition, Hoboken NJ: John Wiley & Sons, Inc.


Fowler, F.J. (1993). Survey Research Methods, 2nd Edition, Newbury Park, CA: SAGE Publications.


Rookey, B. D., Le, L., Littlejohn, M., & Dillman, D. A. (2012). Understanding the resilience of mail-back survey methods: An analysis of 20 years of change in response rates to national park surveys. Social Science Research, 41(6), 1404-1414.
