
DOI Programmatic Clearance for Customer Satisfaction Surveys

Justification for CSS-1

OMB: 1040-0001





U.S. Department of the Interior

Office of Policy Analysis





Guidelines for Accessing the Department of the Interior’s Generic Clearance for Customer Satisfaction Surveys







ATTACHMENT 1:


Instructions for Completing Supporting Statement for DOI Generic Clearance Submission, OMB Approval Number 1040-0001



  1. Survey Title/Date Submitted to the Office of Policy Analysis (PPA): Insert title for the proposed survey. Insert date that the expedited approval package will be submitted to PPA. Reminder: Please submit the package through your bureau/office Information Collection Clearance Officer.

  2. Bureau/Office: Insert the name of the bureau/office conducting the survey.

  3. Abstract: Summarize the proposed study with an abstract not to exceed 150 words.

  4. Bureau/Office Point of Contact Information: Complete the bureau/office contact information. PPA will communicate with the point of contact listed here throughout the entire approval process.

  5. Principal Investigator (PI) Conducting the Survey: Complete information about the PI who will be conducting the survey, if different from the Point of Contact listed in #4. Otherwise, note: Same as #4.

  6. Name of Program Office Conducting Survey: Provide the name of the bureau program, office, or organizational unit conducting the survey.

  7. Description of Customers/Services Provided: Provide a brief description of the customers who will be surveyed, the services provided by the program conducting the survey, and how these services are provided to customers.

  8. Survey Dates: List the time period in which the survey will be conducted, including specific starting and ending dates. The starting date should be at least 45 days after the submission date. The request for expedited approval, and submission of a complete and accurate approval package, must be made at least 45 calendar days prior to the first day the PI wishes to administer the survey instrument to the public.

  9. Type of Information Collection Instrument: Check the type(s) of information collection instrument(s) that will be used. If other, please explain.

  10. Survey Development: Explain how the survey was developed. With whom did you consult during the development of the survey on content? On statistics? Did you pretest the survey? What actions did you take to improve the survey? What suggestions did you receive for improving the survey? Which of the six topic areas will be addressed? (Note: A description of any pre-testing and peer review of the methods and/or instrument is highly recommended.)

  11. Survey Methodology: Explain how the survey will be conducted. Provide a description of the survey methodology, including: (a) how customers will be sampled (if fewer than all customers will be surveyed); (b) what percentage of customers asked to take the survey is expected to respond; and (c) what actions are planned to increase the response rate. If statistics are generated, this description must be specific and include each of the following:

- The respondent universe,

- The sampling plan and all sampling procedures, including how individual respondents will be selected;

- How the instrument will be administered;

- Expected response rate and confidence levels; and

- Strategies for dealing with potential non-response bias.

Note: Web-based surveys are not an acceptable method of sampling a broad population. Web-based surveys must be limited to services provided by the web.

12. Total Number of Initial Contacts/Expected Number of Respondents: Provide an estimated total number of initial contacts and the total number of expected respondents.

13. Estimated Time to Complete Initial Contact/Instrument: Estimate the time to complete the initial contact and the survey instrument (in minutes).

14. Total Burden Hours: Provide the total number of burden hours. The total burden hours should account for the amount of time required to instruct the respondents in completing the survey, and the amount of time required for the respondent to complete the survey.

15. Reporting Plan: Provide a brief description of the reporting plan for the data being collected. A copy of all survey reports must be archived with PPA. Please note this in the reporting plan.

16. Justification, Purpose and Use: Provide a brief justification for the survey, its purpose, goals, and utility to managers. Specifically, describe how data will be tabulated and what statistical techniques will be used to generalize the results to the entire customer population. Describe how data from the survey will be used. Describe how you will acknowledge any limitations related to the data, particularly in cases where we obtain a lower than anticipated response rate. Note whether or not the survey is intended to measure a Government Performance and Results Act (GPRA) performance measure.

ATTACHMENT 2:


Approval Form for DOI Programmatic Clearance for Customer Satisfaction Surveys (OMB Control Number 1040-0001)


U.S. Department of the Interior

Office of Policy Analysis (PPA)

PPA Tracking Number: (for PPA use only)


CSS-1



Date Submitted to PPA:

6/15/2015

1.

Survey Title:

BLM Visitor Satisfaction Survey

2.

Bureau:

Bureau of Land Management (BLM)



3.

Abstract: (not to exceed 150 words)

The purpose of the BLM Visitor Satisfaction Survey is to measure visitors’ opinions about BLM facilities, services, and recreational opportunities. This effort helps the BLM meet requirements of the Government Performance and Results Act of 1993 (GPRA) and supports other BLM and Department of the Interior (DOI) strategic planning efforts. The results are used to measure performance against BLM GPRA goals regarding visitor satisfaction and against BLM and DOI recreation performance measures, and are used by BLM managers to improve the services and recreational opportunities provided to visitors while protecting BLM resources.



4.

Bureau/Office Point of Contact Information


First Name:

Victoria


Last Name:

Josupait



Title:

Outdoor Recreation Planner




Bureau/Office:

BLM



Street Address:

National Operations Center - DRS

DFC, Bldg. 50, PO Box 25047, OC-531


City:

Denver

State:

CO

Zip code:

80225


Phone:

303-236-6313

Fax:

303-236-3508



Email:

[email protected]


5.

Principal Investigator (PI) Information


First Name:

Jennifer


Last Name:

Hoger-Russell


Title:

Research Scientist





Bureau/Office:

University of Idaho, Park Studies Unit


Address:

Mailcode 441139


City:

Moscow

State:

ID

Zip code:

83844-1139


Phone:

208-885-4806

Fax:

208-885-4261


Email:

[email protected]




6.

Name of Program or Office Conducting Survey:

Bureau of Land Management Washington Offices Management Systems (WO 830) and National Recreation and Visitor Services Division (WO 250)

7.

Description of Customers/ Services Provided:

Customers: Recreation visitors and natural, cultural, and heritage tourists to BLM lands.

Services: BLM provides a host of services ranging from protecting natural and cultural resources to offering recreation and tourism such as facility/trail/road maintenance, visitor information, interpretive and educational materials, and general recreation use management.


8.

Survey Dates

(mm/dd/yyyy)

to

(mm/dd/yyyy)



6/15/2015


9/30/2018

9.

Type of Information Collection Instrument (Check ALL that Apply)

_X_ Intercept

___ Telephone

___ Mail

___ Web-based

___ Focus Groups

_X_ Comment Cards

_X_ Other

Explain:

The survey will be self-administered; it will be handed to visitors on-site at specified locations by a BLM survey research technician.



10. Survey Development:

(With whom did you consult during survey development on content? On statistics? Was the survey pretested? How were improvements integrated? Which of the six topic areas will be addressed?)

This survey is the culmination of experience gained in using two prior approved versions. The first version of this survey, which included non-GPRA items, was created with input from customer focus groups, program staff at the field level, and program leads at the state and national levels. This first version was developed and tested with assistance from Kevin Coray, Ph.D., of Coray Gurnitz Consulting, Inc. It was successfully administered from 1999 to 2001 and from 2003 to 2004.


The second version of the survey instrument and the associated methodology were developed in consultation with the University of Idaho Park Studies Unit (UI PSU), specifically Steve Hollenhorst, Ph.D., Director, National Park Service Visitor Services Project/Associate Dean for Outreach and Engagement, Jennifer Hoger-Russell, M.S., Research Scientist/Principal Investigator, and Yen “Lena” Le, Ph.D., Research Scientist/Statistician. Brian Forist, Senior Research Associate with the National Park Service’s Social Science Program (2005) also provided input.


This survey instrument and methodology were developed with the UI PSU staff identified in the previous paragraph. The instrument includes two new satisfaction-based questions required to meet the BLM’s GPRA reporting requirements.


This instrument and methodology meet the Department of the Interior (DOI) guidelines for the generic clearance for customer satisfaction surveys and the GPRA reporting requirements. This instrument retains the previous customer satisfaction items used to comply with GPRA reporting requirements, along with a few demographic questions (number in group, gender, age, and zip code) that are used to perform non-response bias checks. Retaining the original satisfaction questions allows for long-term consistency and for tracking trends in comparison to the results obtained under previous versions of the survey. The continuation of the demographic questions provides recreation site managers with basic information about which types of visitors are satisfied with which facilities, services, and recreational opportunities, in an effort to better meet their needs. This limited demographic information also allows the BLM to assess survey non-response bias.


11. Survey Methodology:

(Use as much space as needed; if necessary include additional explanation on separate page).

Respondent Universe

The population to be sampled is defined as all visitors, 18 years and older, at approximately 20-25 BLM sites each fiscal year.

Sampling Plan/Procedure

This will be an intercept, comment-card survey. The instrument is considered a comment card because it is a one-time survey, with no follow-up and no application of the Dillman method. Intercept surveys are an effective method both for reaching this type of customer segment and for obtaining a maximum response rate when contact information (e.g., name, phone number, mailing address, email address) is not collected. Surveys will be distributed on-site by a trained survey technician (BLM employees or uniformed volunteers). Respondents will be given time and privacy to fill out the survey before depositing it into a locked drop box. The Principal Investigator (PI), the University of Idaho Park Studies Unit (UI PSU), will provide locked drop boxes for survey collection. BLM employees and volunteers will not have the ability to open the locked boxes. Past experience with this methodology strongly suggests that face-to-face contact combined with on-site completion of the survey increases the response rate significantly. Instructions and guidelines will be issued to BLM staff at each site to explain proper sampling procedures in order to achieve the most representative and random sample possible. These instructions will include procedures for recording the number of contacts with potential respondents, the number of refusals, and the number of surveys distributed (see Appendices 3 and 4).


Each location will administer the survey at the time best suited to its needs, usually during a high-use part of the recreation season and for a set period of time (e.g., two months, with a proportional number of weekdays and weekend days). After completion of the survey period, BLM staff will mail the locked drop boxes containing the surveys to the project PI for data entry and analysis. Given past experience with the survey, the response rate is expected to be at least 70%.

Daily start and end times for sampling will be flexible to coincide with historical patterns of use at each location or, where such data are not available, with staff's best judgment of use patterns.


The survey takes approximately 8 minutes to complete.

Instrument Administration

A survey technician will be stationed at a primary location during the optimal sampling time for that location. During sampling times, the systematic selection of visitors (e.g., every nth individual) will vary depending on use levels and season. Only one individual per group will be asked to participate. Individuals chosen for surveying will be informed that the BLM is conducting a visitor study. Using a script as a guide (see Appendix 1), the survey technician will explain that the purpose of the survey is to measure the overall satisfaction of visitors and that the results will guide future management of BLM lands. Visitors will be told that participation is voluntary and that responses will be confidential and anonymous. The survey technician will select every nth person (18 years or older). Those who agree will receive a clipboard and pencil with a copy of the survey (Appendix 2). The visitor will complete the survey on his or her own, with privacy, at the point of interception and deposit it directly into a locked metal drop box. The survey technician will collect the following information in a survey log (see log form in Appendix 3), and an illustrative sketch of the selection and logging procedure follows this list:

  • Date

  • Start and end times for each location

  • Number of people approached to take survey

  • Number of refusals

  • Weather conditions

  • General observations of procedures
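
The following short sketch (in Python) is purely illustrative of the systematic "every nth visitor" selection and logging described above; it is not part of the approved protocol, and the sampling interval and field names are hypothetical.

# Illustrative sketch only (not part of the approved protocol): systematic
# "every nth group" intercept selection with a running log of contacts,
# refusals, and completions. The interval of 5 is a hypothetical example;
# the actual interval varies by site, use level, and season.

def intercept_selection(visitor_groups, interval=5):
    log = {"approached": 0, "refusals": 0, "completed": 0}
    for i, group in enumerate(visitor_groups, start=1):
        if i % interval != 0:
            continue                      # not the nth group; let it pass
        adult = next((v for v in group if v["age"] >= 18), None)
        if adult is None:
            continue                      # no visitor 18 or older in the group
        log["approached"] += 1
        if adult["agrees"]:               # visitor accepts the clipboard survey
            log["completed"] += 1
        else:
            log["refusals"] += 1          # recorded on the refusal tally sheet
    return log

# Hypothetical example: ten groups, every 5th group approached
groups = [[{"age": 34, "agrees": True}], [{"age": 16, "agrees": False}]] * 5
print(intercept_selection(groups, interval=5))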

Expected Response Rate and Confidence Levels:

Surveys will be distributed to approximately 5,000 individuals (between 125 and 400 surveys at each of 20-25 BLM sites). Based on administration of the past survey version, a minimum of 3,500 responses is expected (a 70% response rate); the actual response rate with the previous survey version and methodology was 83%. This response rate is very high compared to private-sector customer service evaluations using comment cards. This high response rate may be attributed to the personal contact from a survey technician and to the methodology requiring the survey to be completed and collected at the time of interception.
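
For reference, the approximate precision implied by a sample of this size can be checked with the standard formula for a proportion. The sketch below is illustrative only; it restates the expected totals given above and assumes simple random sampling, which only approximates the on-site intercept design.

# Illustrative margin-of-error check at a 95% confidence level, using the
# expected number of completed surveys stated above. Assumes simple random
# sampling and the most conservative proportion (p = 0.5).
import math

n = 3500                     # expected completed surveys
p = 0.5                      # conservative proportion assumption
z = 1.96                     # z-score for 95% confidence

margin_of_error = z * math.sqrt(p * (1 - p) / n)
print(f"Approximate margin of error: +/- {margin_of_error:.1%}")   # about +/- 1.7%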

Strategies for dealing with potential non-response bias

At selected sites on specific sampling days, the demographics of individuals who refuse the survey will be collected via a very short (three-question) interview and recorded on a separate tally sheet (Appendix 4). The short interview asks about group size, zip code, and overall satisfaction with the location; gender will be recorded by observation. For those who complete the survey, this demographic information is collected via the survey instrument. Following the completion of the survey season, the data will be analyzed to determine whether a non-response bias exists. If bias is detected, steps to reduce it will be implemented in future survey seasons.
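
One common way to carry out this check, offered here only as an illustration, is to compare a demographic item recorded for respondents and for refusals (for example, observed gender or group size category) using a chi-square test of independence. The counts below are hypothetical, and the choice of test is an assumption rather than the specified procedure.

# Illustrative non-response bias check (hypothetical counts): compare a
# demographic item recorded for respondents and refusals using a chi-square
# test of independence. A small p-value would suggest the two groups differ.
from scipy.stats import chi2_contingency

#                     category A  category B
respondent_counts = [      1800,       1700]   # from the survey instrument
refusal_counts    = [       700,        800]   # from the refusal tally sheet

chi2, p_value, dof, expected = chi2_contingency([respondent_counts, refusal_counts])
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Respondents and refusals differ on this item; possible non-response bias.")
else:
    print("No significant difference detected on this item.")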

Description of any pre-testing and peer review of the methods and/or instrument (recommended)

The first version of the survey was created with input from customer focus groups, program staff at the field level and program leads at the state and national levels, and university researchers. The survey instrument was developed and tested with assistance from Coray Gurnitz Consulting, Inc. The survey was successfully conducted from 1999-2001 and from 2003-2004.

The proposed version of the survey and methodology was developed in consultation with researchers at the UI PSU to add two questions that meet GPRA reporting requirements. The previous version of the survey was approved under OMB control number 1040-0001, was administered with great success, and achieved a response rate of 83%.


The National Park Service has successfully used a similar methodology since 1998.


12.

Total Number of Initial Contacts/ Expected Number of Respondents

5000/3500

13.

Estimated Time to Complete Initial Contact/ Instrument (mins.):

1/8

14.

Total Burden Hours:

575 – This includes an additional 25 hours for contacts with visitors who declined the survey and/or were involved in the non-response bias check (1,500 refusals/non-response bias checks at one minute each; 1,500/60 = 25 hours).
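
The total above can be reproduced from the figures reported in items 12 and 13 (5,000 initial contacts at roughly one minute each and 3,500 completed surveys at eight minutes each) plus the 25 hours for refusal/non-response bias contacts. The sketch below simply restates that arithmetic as a check and reflects one plausible reading of how the components add up.

# Illustrative check of the 575 total burden hours reported above, using the
# figures from items 12 and 13 of this form. The breakdown is an assumption
# offered for illustration only.
initial_contacts    = 5000 * 1 / 60   # 5,000 initial contacts at ~1 minute each
completed_surveys   = 3500 * 8 / 60   # 3,500 completed surveys at ~8 minutes each
bias_check_contacts = 1500 * 1 / 60   # 1,500 refusal/non-response checks at ~1 minute each

total_burden_hours = initial_contacts + completed_surveys + bias_check_contacts
print(round(total_burden_hours))      # -> 575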

15. Reporting Plan:

The University of Idaho will provide the BLM with a National Report as well as summary reports for each location.

16. Justification, Purpose, and Use:

Survey Justification and Purpose

The Government Performance and Results Act of 1993 (P.L. 103-62) requires that the BLM develop goals to improve program effectiveness and public accountability, and that it measure performance related to these goals. This visitor satisfaction survey measures performance toward those goals through a short, scannable instrument.


The BLM has a need to better understand visitors so that limited agency staff and budget resources can be most effectively applied in providing for quality recreation opportunities. The purpose of the study is to gain baseline and trend information on levels of satisfaction of visitors seeking outdoor recreation opportunities on BLM lands. The survey also provides site managers with information about the strengths and weaknesses of visitor service at the site, the service needs of visitors, as well as ideas and suggestions for improving customer service (see open-ended questions at the end of the survey – Appendix 2).


Data from the survey are needed to assess performance on BLM GPRA goals 1.3.08, 3.1.02, 3.1.11, and 3.2.01. These goals are tied to DOI Strategic Goals as follows:


DOI Strategic Goal 1: Resource Protection: Protect the nation’s natural, cultural, and heritage resources.


End Outcome Goal 1.3: Protect Cultural and Natural Heritage Resources.

BLM Measure 1.3.08: National Monuments and National Conservation Areas: Initiate priority projects to achieve the resource condition objectives for National Monuments and National Conservation Areas.


DOI Strategic Goal 3: Recreation: Provide recreation opportunities for America.

End Outcome Goal 3.1: Provide for a Quality Recreation Experience, including Access and Enjoyment of Natural and Cultural Resources on DOI Managed Lands and Partnered Lands and Waters.

BLM Measure 3.1.01: Satisfaction with the quality of recreation experience.

BLM Measure 3.1.11: Interpretation and Education: Satisfaction with the quality of interpretation and environmental education products in Special Recreation Management Areas as measured by a survey.

BLM Measure 3.1.15: Satisfaction with services provided by commercial recreational operations.


End Outcome Goal 3.2: Provide For and Receive Fair Value in Recreation.

BLM Measure 3.1.16: Customer satisfaction with value for fee paid.



Survey Goals

- To conduct a scientifically sound annual satisfaction study of BLM recreation visitors.

- To better understand the level of visitor satisfaction with a range of offerings (facilities, interpretation products, road and trail maintenance, etc.) for the purpose of allocating limited agency resources to maintain or improve the quality of recreation opportunities and experiences on BLM Lands.

- To better understand the strengths and weaknesses of services both at the site level and at an overall national level.

- To meet BLM GPRA performance requirements.


Utility to Managers

The results of this study will provide managers with baseline information about visitor satisfaction with current opportunities, information, education/interpretive programs, facilities, and value for fee paid. The survey provides site managers with information about the strengths and weaknesses of visitor services at the site, the service needs of visitors, some demographic characteristics of their visitors, as well as ideas and suggestions for improving customer service (see open-ended questions at the end of the survey - Appendix 2). This information will support planning as outlined in the DOI Strategic Plan. Field office and site managers will be able to use this information to allocate limited agency resources to improve performance where satisfaction levels are low.

How will the results of the survey be analyzed and used?

Basic descriptive statistics (frequencies, measures of central tendency) will be provided for all questions. Closed-ended questions will be scanned and analyzed using SPSS software and, where practical, frequency data will be presented graphically. Responses to open-ended questions will be provided verbatim to field and site managers. Site-specific reports will be distributed to the participating offices and the Washington Office (WO). Additionally, the WO will receive a national-level report; all reports are posted on the University of Idaho Park Studies Unit web site at: http://psu.uidaho.edu/blm/index.htm.



How will the data be tabulated? What Statistical Techniques will be used to generalize the results to the entire customer population? How will limitations on use of data be handled? If the survey results in a lower than anticipated response rate, how will you address this when reporting the results? (Use as much space as needed; if necessary include additional explanation on separate page).

Returned surveys will be electronically scanned and the data analyzed. Responses from individual sites in the system will be combined into one dataset. Data from sites with fewer than 30 returned survey cards, or from sites with discrepancies in the data collection methods, will be omitted from the system report. Frequency distributions will be calculated for each indicator and category.
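
As a rough illustration of the tabulation step described above (combining site-level responses, omitting sites with fewer than 30 returned cards, and calculating frequency distributions), the sketch below uses Python with pandas. The actual analysis is performed by the UI PSU using SPSS, and the file and column names here are hypothetical.

# Illustrative tabulation sketch (the actual analysis uses SPSS): combine
# site-level responses, drop sites with fewer than 30 returned survey cards,
# and compute a frequency distribution for a satisfaction item. The file and
# column names are hypothetical.
import pandas as pd

surveys = pd.read_csv("scanned_surveys.csv")            # hypothetical combined dataset

# Keep only sites with at least 30 returned survey cards
site_counts = surveys.groupby("site_id").size()
valid_sites = site_counts[site_counts >= 30].index
usable = surveys[surveys["site_id"].isin(valid_sites)]

# Frequency distribution (counts and percentages) for an overall-satisfaction item
freq = usable["overall_satisfaction"].value_counts().sort_index()
pct = (freq / freq.sum() * 100).round(1)
print(pd.DataFrame({"count": freq, "percent": pct}))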

Limitations of the use of the data will be clearly conveyed in the report documents. Reports will convey to the reader that the results do not necessarily apply to visitors during other times of the year, to visitors who did not visit the survey locations, or to other units in the system that did not participate in the survey.

Historically, this survey has not had difficulties with low response rates; however, if a low response rate were encountered, the report would indicate this with specific language noting that low survey response rates increase the probability of non-response bias. Non-response bias occurs when those who choose to participate in a survey differ substantially and systematically from those who choose not to participate. If these differences are related to GPRA measures, the results may be unreliable.



Is this survey intended to measure a Government Performance and Results Act (GPRA) performance measure? If so, please include an excerpt from the appropriate document. (Use as much space as needed; if necessary include additional explanation on separate page).

The following are the relevant DOI Strategic Goals, End Outcomes, and BLM Measures as identified in the U.S. Department of the Interior GPRA Strategic Plan Fiscal Years 2007-2012:

DOI Strategic Goal 1: Resource Protection: Protect the nation’s natural, cultural, and heritage resources.

End Outcome Goal 1.3: Protect Cultural and Natural Heritage Resources.

BLM Measure 1.3.08: National Monuments and National Conservation Areas: Initiate priority projects to achieve the resource condition objectives for National Monuments and National Conservation Areas.


DOI Strategic Goal 3: Recreation: Provide recreation opportunities for America.

End Outcome Goal 3.1: Provide for a Quality Recreation Experience, including Access and Enjoyment of Natural and Cultural Resources on DOI Managed Lands and Partnered Lands and Waters.

BLM Measure 3.1.01: Satisfaction with the quality of recreation experience.

BLM Measure 3.1.11: Interpretation and Education: Satisfaction with the quality of interpretation and environmental education products in Special Recreation Management Areas as measured by a survey.

BLM Measure 3.1.15: Satisfaction with services provided by commercial recreational operations.


End Outcome Goal 3.2: Provide For and Receive Fair Value in Recreation.

BLM Measure 3.1.16: Customer satisfaction with value for fee paid.



ATTACHMENT 3


Checklist for Submitting a Request to Use DOI Programmatic Clearance for Customer Satisfaction Surveys


X All questions in the survey instrument are within the scope of one of the DOI Programmatic Clearance for Customer Satisfaction Surveys topic areas.



X The approval package is being submitted to the Office of Policy Analysis at least 45 days prior to the first day the PI wishes to administer the survey to the public.



X A qualified statistician has reviewed and approved your request.



X Your bureau/office Information Collection Clearance Officer has reviewed and approved the approval package.



The approval package includes:


X A completed Information Form

X A signed Certification Form

X A copy of the survey instrument

X Other supporting materials, such as:

    • Cover letters to accompany mail-back questionnaires

    • Introductory scripts for initial contact of respondents

    • Necessary Paperwork Reduction Act compliance language

    • Follow-up letters/reminders sent to respondents



The survey methodology presented on the Information Form includes a specific description of:

X The respondent universe

X The sampling plan and all sampling procedures, including how respondents will be selected

X How the instrument will be administered

X Expected response rate and confidence levels

X Strategies for dealing with potential non-response bias

X A description of any pre-testing and peer review of the methods and/or the instrument is highly recommended.




X The burden hours reported on the Information Form include the number of burden hours associated with the initial contact of all individuals in the sample (i.e., including refusals), if applicable, and the number of burden hours associated with individuals expected to complete the survey instrument.


X The package is properly formatted (Word) and submitted to the Office of Policy Analysis electronically. 


ATTACHMENT 4

CERTIFICATION FORM FOR SUBMISSION UNDER OMB CONTROL NUMBER 1040-0001

This form should only be used if you are submitting a collection of information for approval under the DOI Programmatic Clearance for Customer Satisfaction Surveys.

If the collection does not satisfy the requirements of the Programmatic Clearance, you should follow the regular PRA clearance procedures described in 5 CFR 1320.

  1. Bureau/Office Subgroup or Program: BLM

  2. Title (Please be specific): BLM Visitor Survey

  3. Burden Hour Estimate

     Number of Respondents: 3500

     Total Burden Hours: 575 (includes an additional 25 hours for contacts with visitors who declined the survey and/or were involved in the non-response bias check)

     Hours/Min per Response: 0/8

  4. Bureau/Office Contact (who can best answer questions about content of the submission):

     Name: Victoria Josupait

     Phone: 303-236-6313, email: [email protected]

  5. Certification: The collection of information requested by this submission meets the requirements of OMB control number 1040-0001.

  6. Bureau/Office Qualified Statistician: Yen “Lena” Le, Ph.D., University of Idaho, Park Studies Unit, P.O. Box 441139, Moscow, ID 83844, Phone: 208-885-2819, email: [email protected]

     DATE: 8/13/09

  7. Bureau/Office Information Collection Clearance Officer: Jean Sonneman, Acting

     DATE: 9/18/2009

  8. Office of Policy Analysis: Donald J. Bieniewicz, DOI Information Collection Coordinator

     DATE: 06/17/2015

  9. OMB, Office of Information and Regulatory Affairs (OIRA)

     DATE:
