



U.S. DEPARTMENT OF THE INTERIOR

BUREAU OF LAND MANAGEMENT


PAPERWORK REDUCTION ACT SUBMISSION

Supporting Statement

PART B: STATISTICAL METHODS


Surveys and Focus Groups to Support Outcomes-Focused Management (RECREATION SURVEY)


OMB Control Number 1004-0NEW


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


Respondent Universe

The purpose of this social science information collection is to provide relevant public input to support outcomes-focused management decisions required by the BLM Planning for Recreation and Visitor Services Handbook (H-8320-1), approved in 2014.


BLM Field Offices will be selected to administer a visitor survey based on one of two conditions: 1) a forthcoming Land Use Plan (LUP) in which Recreation Management Areas (RMAs) might be considered (e.g., due to high visitation, unique recreation opportunities, or unique natural features); or 2) a completed LUP for which achievement of LUP objectives needs to be monitored.


BLM Field Offices will be selected to administer a focus group based on one of three conditions: 1) a forthcoming LUP in which RMAs might be considered (e.g., due to high visitation, unique recreation opportunities, or unique natural features); 2) a recently completed LUP in which Special Recreation Management Areas (SRMAs) were designated; or 3) an effort to engage local communities and stakeholders in recreation planning and management consistent with the BLM national recreation strategy, “Connecting with Communities.”



Pre-LUP Visitor Survey

The respondent universe for this component of the information collection is all visitors 18 years of age or older at the BLM-managed sites for which the respective field office is gathering data in support of a Resource Management Plan (RMP).



Post-LUP Survey

The respondent universe for this component of the information collection is all visitors 18 years of age or older to BLM-managed areas with a completed LUP in which outcomes-focused objectives were established and for which the respective field office is gathering data in support of an RMP.



Focus Groups in Communities

The universe for the focus groups is residents, age 18 and older, of communities near BLM-managed areas with a forthcoming LUP for which the respective field office is gathering data in support of an RMP. This might include elected officials, business owners, representatives of organizations and clubs, and members of the general public, including those from underserved communities. For some field offices, the relevant population might extend beyond the local community (e.g., non-local visitors, non-locally based guides or outfitters).


The tables below summarize the respondent universe and sample for the surveys and focus groups.


Respondent universe and sample for surveys.

  BLM field offices conducting surveys: 6
  Approximate # visitors in universe¹: 1,250,000
  Visitors sampled at each area²: 727
  Total # visitors sampled: 3,635

¹ Visitation varies across the sites; an annual average of 250,000 was used to calculate the number of visitors in the universe.

² This is the total number of visitors completing an onsite survey.


Communities and participants for focus groups.

  BLM field offices conducting focus groups¹: 5
  Approximate # of focus groups across communities²: 6
  Participants in each focus group³: 25
  Total # of participants: 750

¹ The number of gateway communities near areas with forthcoming LUPs varies across the sites; an average value of 3 was used to calculate the total number of gateway communities.

² The focus groups will target specific audiences, e.g., elected officials/community leaders, business interests, the general public, visitors, underserved communities, etc. The audiences targeted will vary by field office/community. On average, there will be 2 focus groups per community (i.e., 3 communities, with 2 focus groups each = 6 focus groups).

³ As representation is not the goal of the focus groups, residents of the gateway communities will not be sampled. Rather, residents will voluntarily participate after being exposed to various recruitment efforts (see question 2). The number of participants will vary at each focus group. Based on previous, similar focus groups, participation is expected to average 25.



Expected Response Rate for Visitor Survey

Projects conducted in Colorado (Kremmling, Alpine Loop, Glenwood and Gateway Canyons) by independent university researchers used a similar approach (i.e., a short onsite survey with a longer follow-up survey). Those university-led studies, which formed the template for this BLM research effort, yielded an average onsite visitor survey response rate of 90 percent. The subsequent mail survey (using a modified Dillman methodology with a 1-week postcard reminder and a second mailing) yielded an average response rate of 46 percent. Likewise, the federal lands Collaborative Visitor Transportation Survey used similar methods, but with an 8-page onsite survey, and achieved an 80 percent onsite response rate and 54 percent and 48 percent response rates to the email and mail follow-up surveys, respectively. Based on the response rates from those previous studies, the fact that respondents are interested in BLM lands (i.e., they have self-selected to visit BLM lands), the use of reminder postcards, and the option for visitors to complete the survey online as well as by mail, we estimate that the final response rate should exceed 50 percent.



Expected Response Rate for Focus Groups

The focus group questions, scripts, and methodology are the result of over 15 years of development; the last 3 years incorporated the use of electronic clickers by focus group participants. While response rate is not an applicable concept for focus groups, the focus group methods provide anonymity to respondents, which will minimize item nonresponse.


2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


Visitor Survey

The onsite and internet/mail survey methodology will employ random sampling of visitors to BLM-managed areas. Two purposes of this information collection are 1) to document visitor characteristics and preferences for management; and 2) in some areas, to assess whether visitors to potential Recreation Management Zones (RMZs) differ on key variables included in the survey (e.g., trip characteristics, preferences for recreation settings, desired onsite experiences, and anticipated longer-term benefits). If the potential RMZs do exhibit large differences, management of the zones can be tailored to accommodate differences in the key variables. Thus, results need to exhibit a degree of accuracy sufficient to assess overall visitor characteristics and sufficient power to determine when zones exhibit a moderate effect size on key variables (e.g., a 20% difference in visitors’ preferences for setting characteristics in the different zones, or a mean difference in desired onsite experiences of 1 point on a 5-point scale). When potential RMZs are being considered, BLM staff and the lead institution will work together to map the area or potential Special Recreation Management Area (SRMA) into smaller geographic zones that likely contain unique recreational values and opportunities (i.e., areas that have the potential to be managed as an RMZ). Within the area/SRMA, no more than 6 potential RMZs will be identified.


At each area/potential SRMA, 766 visitors will be contacted during the study period, allowing for the completion of 400 follow-up surveys (based on 95% of contacted visitors completing the onsite survey and 55% of those completing the follow-up survey). Four hundred completed responses will allow overall estimates to within +/- 5% of the population value. The typical ingress points to each of the zones will be identified and included in the sample design. The days during the study period will be divided into late morning and early afternoon time blocks (the time periods in which surveying occurs). The number of time blocks sampled for each zone will vary based upon recreation planners’ estimates of total visitation and of weekend vs. weekday visitation. Sampling targets will then be established for both weekend and weekday visitor contacts in each zone.
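
The arithmetic behind the contact target and the +/- 5% precision estimate can be checked with the short, illustrative Python sketch below; it assumes only the response rates stated above and the standard normal-approximation margin of error for a proportion (worst case p = 0.5, no finite population correction), and is not part of the collection instruments.

```python
import math

# Planning assumptions from the text: 95% onsite response, 55% follow-up
# completion, and a target of 400 completed follow-up surveys per area/SRMA.
ONSITE_RATE = 0.95
FOLLOWUP_RATE = 0.55
TARGET_COMPLETES = 400

# Onsite contacts needed to yield the target number of completed follow-up surveys.
contacts_needed = math.ceil(TARGET_COMPLETES / (ONSITE_RATE * FOLLOWUP_RATE))
print(contacts_needed)  # 766

# Margin of error at 95% confidence for a proportion, worst case p = 0.5.
moe = 1.96 * math.sqrt(0.5 * 0.5 / TARGET_COMPLETES)
print(round(moe, 3))  # 0.049, i.e., within +/- 5 percentage points
```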

For areas/SRMAs with 6 potential RMZs, the sample will be designed to contact and sample a minimum of 106 onsite visitors from each potential RMZ, resulting in 66 completed follow-up surveys from visitors to that zone. A sample size of 66 for each potential RMZ will result in power of .90 (two-tailed test) to detect a 1-point difference if data from the RMZs each have a standard deviation of 1.75, a standard deviation consistent with previous research.¹
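
The per-zone target of 66 completed follow-up surveys corresponds to a standard two-sample power calculation. A minimal sketch is shown below; it uses Python’s statsmodels package rather than the G*Power software cited in the footnote, so the unrounded result may differ slightly from that program’s output.

```python
import math
from statsmodels.stats.power import TTestIndPower

# Minimum difference of interest: 1 point on a 5-point scale, with an assumed
# standard deviation of 1.75 (per the text), alpha = .05, power = .90, two-tailed.
effect_size = 1.0 / 1.75  # Cohen's d of roughly 0.57

n_per_zone = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.90, alternative="two-sided"
)
print(math.ceil(n_per_zone))  # approximately 66 completed surveys per potential RMZ
```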


  • For high recreation use potential RMZs, time blocks will be reduced in length so that a minimum of five time blocks are randomly selected. Time blocks within the zone will be allocated across weekends and weekdays, in proportion to use, following methods outlined in Survey research and analysis: Applications in parks, recreation and human dimensions.²

  • For low recreation use potential RMZs, the number of time blocks necessary to sample 106 onsite visitors will be determined and randomly assigned. Time blocks within the zone will be allocated across weekends and weekdays, in proportion to use, following methods outlined in Survey research and analysis: Applications in parks, recreation and human dimensions.²


A completely random sample (e.g., assigning a number to each possible time block at each sample location and randomly selecting numbers) would result in several sample locations being selected at the same time. Accounting for the unique characteristics of each site, a sampling plan will be designed that allows some element of random selection while avoiding the excessive staff time and financial resources that a completely random sample would require. For example, if sampling locations are near the community where the surveyors reside, we would randomly select the days to be sampled and then randomly select the locations to be sampled (based on available surveyor resources). If sampling locations are remote and/or located a long distance from each other, requiring a multi-day trip to sample, trips based on logical travel patterns would be developed, with the order of sampling locations within any trip randomized along with the starting day of the trip.
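
As an illustration of the “randomly select days, then randomly select locations” option described above, the sketch below draws a hypothetical schedule; the study period, number of sampling days, and location names are placeholders rather than values from the study design.

```python
import random
from datetime import date, timedelta

# Illustrative "random days, then random locations" draw. All specifics
# (study period, number of sampling days, location names) are placeholders.
random.seed(0)  # fixed seed so a drawn schedule can be documented and reproduced

study_days = [date(2024, 6, 1) + timedelta(days=d) for d in range(60)]
locations = ["Trailhead A", "Trailhead B", "River access", "Scenic overlook"]

sampled_days = sorted(random.sample(study_days, 20))                 # random days
schedule = {day: random.choice(locations) for day in sampled_days}   # random location per day

for day, site in schedule.items():
    day_type = "weekend" if day.weekday() >= 5 else "weekday"
    print(day.isoformat(), day_type, site)
```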


The onsite survey is designed to be completed by visitors entering or exiting the site. During onsite sampling, if a group (as opposed to a single visitor) is entering or exiting, the interviewer will request that the person with the most recent birthday complete the survey. This allows for randomization within groups. In low-use zones, each visiting group will be sampled; in high-use zones, every third visitor (or visiting group) will be sampled, with the first visitor/group determined by a randomly selected number between 1 and 10.
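
A minimal sketch of the contact rule for high-use zones follows; the stream of arriving groups is simulated with placeholder values, and the within-group selection by most recent birthday is carried out by the interviewer rather than by code.

```python
import random

# Illustrative systematic selection for high-use zones: every third visiting
# group is contacted, with the first contact chosen at random from groups 1-10.
# The arrival list stands in for the groups observed during a time block.
random.seed(1)

arrivals = list(range(1, 101))      # arrival order of visiting groups
start = random.randint(1, 10)       # randomly selected starting group (1-10)
contacted = arrivals[start - 1::3]  # every third group from the starting point

print(start, contacted[:5])
```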


Focus Groups

As a random sample is not the goal of the focus groups, participants would be solicited through a variety of methods including agency lists of key stakeholder groups; outreach to BLM partners; outreach to leaders or organizations within underserved communities; BLM field office websites; flyers at visitor centers, information kiosks, BLM offices, public spaces of gateway communities, and local hotels and restaurants; and local newspaper(s). While the focus groups are not intended to provide data that represent the population with a specific margin of error (e.g., +/- 5%), a diversity of participants is desired. Thus we will ensure the above advertisements reach a diversity of relevant individuals and/or groups. In addition, the focus groups will include questions to measure participant affiliation and other characteristics (e.g., demographics), providing a measure of diversity.


Less Frequent Than Annual Data Collection

The information collected is required as part of the BLM’s land use planning process. Specifically, information is required prior to the development of a Land Use Plan (LUP) and after the plan is complete, to monitor conditions specified in the LUP. The post-LUP survey will be conducted at the maximum time interval consistent with monitoring those conditions and will include only the questions necessary to monitor the conditions specified in the LUP.


3. Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


To maximize response rates and deal with nonresponse bias, the following steps will be taken.


Training for Personnel Who Will Administer Onsite Survey

In order to maximize the onsite survey response rate, field researchers will go through extensive training. The survey personnel who administer the onsite survey will be trained prior to working in the field by the lead researcher. The training will address the following topics and exercises:

  • A basic orientation to the study purpose including the unique study objectives from the planning, marketing, and operations perspectives;

  • A basic orientation to sampling theory and the need for randomness;

  • An overall review of the study methodology and sampling strategy. Discussion of the weekend and weekday sampling strategy, sample times, and respondent selection process;

  • An orientation to the onsite survey instrument, proper administration and completion procedures, and the subsequent follow-up survey;

  • An orientation to interviewing techniques and the role of an interviewer; and

  • Actual practice in administering the onsite and follow-up surveys, both in the training session and onsite.


Visitors will be contacted at trailheads and entrances to the recreation areas. The onsite survey will be brief (respondent burden of about 5 minutes). If visitors show reluctance to participate, field surveyors will reinforce the importance of visitor feedback and of their particular input, followed by one final request to participate in the onsite survey.


The Tailored Design Method

To maximize response rates, the internet or mail follow-up survey will utilize the Dillman Tailored Design Method.³ The initial contact will be followed by a thank you/reminder email [postcard] at 1 week and a second email [mailing] of the survey to non-respondents after 2 weeks.
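
For illustration, this contact sequence maps to a simple schedule; the initial-contact date below is a placeholder, and only the 1-week and 2-week offsets come from the method described above.

```python
from datetime import date, timedelta

# Illustrative follow-up contact schedule. The initial-contact date is a
# placeholder; the 1-week and 2-week offsets follow the description above.
initial_contact = date(2024, 7, 15)                    # first email/mailing of the survey
reminder = initial_contact + timedelta(weeks=1)        # thank you / reminder contact
second_mailing = initial_contact + timedelta(weeks=2)  # survey resent to non-respondents

print(initial_contact, reminder, second_mailing)
```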


Multiple Options for Completing the Survey

We will provide respondents with two options to complete the follow-up survey: an online version that is completed through a website, or a paper version that is completed and returned by mail. This should increase response rates by allowing respondents to select the option they prefer to complete.


Addressing Potential Nonresponse Bias

Nonresponse bias will be assessed for both the onsite and internet/mail follow-up survey.


For the pre-Land Use Plan (LUP) onsite survey, if a visitor refuses to participate, they will be thanked for their time and non-respondent data will be collected through a brief consultation with the non-respondent, if willing, or through observation, if not (e.g., the surveyor’s best assessment of gender, group size, type of recreation activity, and state on the vehicle license plate). The non-response variables will allow comparison to the characteristics of respondents. The day of week and time of contact will be recorded for refusals and completed surveys, allowing a statistical check for non-response bias. The post-LUP onsite survey will follow similar methods; because it does not contain the activity question, the type of recreation activity will be observed and recorded for all contacts. In addition, the data from the onsite survey serve as a baseline to assess, and correct through weighting if needed, the representativeness of the follow-up surveys.


The sampling frame for the follow-up internet/mail survey will be derived from the onsite survey; thus, the onsite survey will provide several variables with which to statistically compare follow-up survey respondents to nonrespondents.
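
For example, a comparison of follow-up respondents and nonrespondents on a variable collected onsite could take the form sketched below; the activity categories and counts are hypothetical.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical comparison of follow-up respondents vs. nonrespondents on an
# activity variable collected in the onsite survey (counts are made up).
#                    hiking  OHV  camping  other
table = np.array([[   120,   60,      45,    25],   # completed the follow-up survey
                  [    90,   70,      30,    35]])  # did not complete it

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
# A small p-value would indicate respondents and nonrespondents differ on this
# variable, suggesting the follow-up data should be weighted (see below).
```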


If the non-response bias test suggests results might not represent the population, the data can be weighted prior to reporting.
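
A minimal sketch of such weighting is shown below, using hypothetical activity shares; as described above, the onsite survey distribution serves as the benchmark for the follow-up sample.

```python
# Illustrative post-stratification weights: follow-up respondents in each
# category are weighted so the weighted sample matches the onsite distribution.
# All shares below are hypothetical.
onsite_share = {"hiking": 0.45, "OHV": 0.25, "camping": 0.20, "other": 0.10}
respondent_share = {"hiking": 0.55, "OHV": 0.18, "camping": 0.17, "other": 0.10}

weights = {k: onsite_share[k] / respondent_share[k] for k in onsite_share}
print(weights)  # e.g., OHV respondents weighted up, hiking respondents weighted down
```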


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Many of the questions included in this information collection have a long history of testing for reliability and validity. For example, a key set of questions measuring desired onsite experiences uses the Recreation Experience Preference scales, which have been subject to extensive testing.⁴ In addition, many of the questions in this survey have been used in previous studies and are included in the National Park Service’s Pool of Known Questions, the Federal Land Management Agencies Compendium of Questions (OMB control no. 0596-0236), and the Fish and Wildlife Service’s Survey of National Wildlife Refuge Visitors (OMB control no. 1018-0145).


The proposed procedures are very similar to those of other studies. For example, the OMB-approved National Park Service Visitor Services Project conducts an onsite survey with follow-up mail/online surveys. The Fish and Wildlife Service’s Survey of National Wildlife Refuge Visitors contacted visitors onsite, and participants completed the follow-up survey either online or by mail.


As highlighted in the response to question 1, Supporting Statement B, university researchers have conducted studies specific to BLM lands using methods and questions proposed in this information collection.


All of the above have proven successful.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The survey and focus group materials have been developed by Dr. Randy Virden, Professor Emeritus, School of Community Resources and Development, Arizona State University; Dr. T. Timothy Casey, Professor of Political Science, Colorado Mesa University and Director of CMU Natural Resources Center; and Dr. Peter Fix, Professor of Outdoor Recreation and Chair, Department of Natural Resource Management, University of Alaska Fairbanks. These three individuals have extensive experience in conducting either recreational visitor surveys or focus groups related to public lands management.


Collection and analysis agency:

The Bureau of Land Management will be the lead federal agency for data collection and will work with the lead institution, as outlined in Supporting Statement A, to gather and analyze the data.


Statistical consultants:

Dr. Joseph Little, Associate Professor and Director of the Master of Science in Resource and Applied Economics Program at the University of Alaska Fairbanks. Current affiliation: W.A. Franke College of Business, Northern Arizona University; (928) 523-7409.




###




1 Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175-191.

Glass, G. V., & Hopkins, K. D. (1996). Statistical methods in education and psychology. New York, NY: Pearson.

2 Vaske, J. J. (2008). Survey research and analysis: Applications in parks, recreation and human dimensions. State College, PA: Venture Publishing.

3 Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method (4th ed.). Hoboken, NJ: John Wiley & Sons.

4 Manfredo, M. J., Driver, B. L., & Tarrant, M. A. (1996). Measuring leisure motivation: A meta-analysis of the recreation experience preference scales. Journal of Leisure Research, 28(3), 188-213.


