SUPPORTING STATEMENT
Social Values of Ecosystem Services (SolVES) in Marine Protected Areas for Management Decision-Making
OMB Control No. 0648-xxxx
Respondent Universe and Sample Size Estimation for Resident Surveys
Mission-Aransas NERR
The potential study universe for the resident data collection effort includes any person (age 18 or older) living in one of the five counties adjacent to the Mission-Aransas NERR site: Aransas, Calhoun, Refugio, San Patricio, and Nueces. According to the 2010 U.S. Census, the total population of residents age 18 or older across all five counties is 338,444. To generalize to this population at the 95% confidence level (with a ±5% margin of error), responses from 384 residents are needed. However, because mail-back surveys typically achieve a response rate of only 20%-30%,[1] a total of 1,535 surveys must be mailed to reach the 384 valid responses required.
Olympic Coast NMS
The potential study universe for the resident data collection effort includes any person (age 18 or older) living in one of the three counties adjacent or nearest to the Olympic Coast NMS site: Clallam, Grays Harbor, and Jefferson. According to the 2010 U.S. Census, the total population of residents age 18 or older across all three counties is 140,914. To generalize to this population at the 95% confidence level (with a ±5% margin of error), responses from 383 residents are needed. However, because mail-back surveys typically achieve a response rate of only 20%-30%,[2] a total of 1,532 surveys must be mailed to reach the 383 valid responses required.
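As a rough cross-check of the figures above, the required sample sizes can be reproduced from Cochran's sample-size formula with a finite population correction, and the mailing counts follow from the assumed 25% response rate. This is an illustrative sketch; the document's own rounding conventions may differ slightly (e.g., it reports 1,535 mailings for the NERR).

```python
import math

def required_sample(population, z=1.96, margin=0.05, p=0.5):
    """Cochran's sample-size formula with finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size (~384.16)
    return round(n0 / (1 + (n0 - 1) / population))

def mailings_needed(valid_returns, response_rate=0.25):
    """Surveys to mail so the expected valid returns are achieved."""
    return math.ceil(valid_returns / response_rate)

nerr = required_sample(338_444)   # Mission-Aransas NERR counties -> 384
nms = required_sample(140_914)    # Olympic Coast NMS counties -> 383
print(nerr, nms, mailings_needed(nerr))
```

With a 25% response rate the mailing count works out to roughly four times the required returns, matching the magnitude of the figures in the table below.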
The table below displays, for both sites, the total population over age 18; the target sample size; the response rate; the number of valid returns needed; the time per response; the total burden hours; and the total labor cost of respondent burden.
Site | Population >18 yrs. | Target sample | Response rate | Valid returns needed | Time per response (min.) | Total burden (hrs.) | Labor cost
M-A NERR | 338,444 | 1,535 | 0.25 | 384 | 20 | 128 | $1,399
OC NMS | 140,914 | 1,532 | 0.25 | 383 | 20 | 128 | $1,567
Totals | 479,358 | 3,069 | 0.25 | 767 | 20 | 256 | $2,966
Non-response | -- | 2,302 | -- | -- | 1 | 38 | $0
Respondent Universe and Sample Size Estimation for Intercept Surveys
The potential study universe for the intercept data collection effort includes any person over 18 years of age visiting a NERR or NMS site who engages in any non-commercial recreational activity from January 2014 to April 2015.
In NERR and NMS sites, non-commercial users are generally not required to secure a permit or license for most recreational activities; consequently, the total number of users in the potential respondent universe is unknown.
However, based on management plans from a number of NERR and NMS locations,[3,4,5,6] the average number of visitors per year, per NERR/NMS site, is approximately 37,257. This equates to approximately 3,105 monthly visitors (37,257/12) to an average NERR/NMS site who engage in non-commercial recreational activities, and these visitors comprise the potential universe for the intercept and interview approaches. It is unknown whether the 3,105 monthly visitors to the NERR/NMS sites are unique or returning visitors.
We expect to survey approximately 11 users per day over 30 sampling days, for a minimum of 324 users surveyed per site. A response rate of 90% has been typical of similar intercept surveys, so we anticipate a similar response rate for the present data collection; a non-response rate of 10% is assumed. To accommodate that non-response, a target sample size of 356 has been set for each intercept location:
Survey site | Avg. # of visitors/month | Target sample | Response rate | Valid returns needed | Response time (min.) | Burden hours | Labor cost
M-A NERR | 3,104 | 356 | 0.9 | 324 | 20 | 108 | $1,181
OC NMS | 3,104 | 356 | 0.9 | 324 | 20 | 108 | $1,322
Totals | 6,208 | 712 | 0.9 | 648 | 20 | 216 | $2,503
Survey approach | N | Target sample | Response rate | Valid returns needed | Response time (min.) | Burden hours | Labor cost
Residential | 479,358 | 3,069 | 0.25 | 767 | 20 | 256 | $2,966
Intercept | 6,208 | 712 | 0.9 | 648 | 20 | 216 | $2,503
TOTAL | 485,556 | 3,781 | -- | 1,415 | 20 | 472 | $5,469
Sample Selection for Resident Surveys
Using the Sampling Tool 10 for ArcGIS, created by NOAA/National Center for Coastal Ocean Science (NCCOS),[7] a spatially oriented random sample of residents will be established in the five counties adjacent to the Mission-Aransas NERR site and the three counties adjacent to the Olympic Coast NMS site. These county groups were chosen at the request of the NERR and NMS staff and through review of the sites' management plans to identify the populations most relevant to each site. Then, using the reverse geocoding tool available in ESRI's ArcGIS environment, the spatially random points will be assigned physical addresses based on road network data and address locator files generated from U.S. Census Bureau information. The physical address locations will be compared to tax assessors' information for all eight counties and adjusted as necessary to ensure U.S. Postal Service delivery of the paper-based mail-back survey instrument.
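The NCCOS Sampling Tool and ArcGIS geocoding steps cannot be reproduced here, but the core idea of a spatially random sample — uniform points rejected until they fall inside a county polygon — can be sketched in pure Python. The county boundary below is a hypothetical quadrilateral, not real Census geography.

```python
import random

def point_in_polygon(x, y, poly):
    """Ray-casting test: is (x, y) inside the polygon (list of (x, y) vertices)?"""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def spatial_random_sample(poly, k, seed=0):
    """Rejection-sample k spatially random points inside a polygon."""
    rng = random.Random(seed)
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    pts = []
    while len(pts) < k:
        x = rng.uniform(min(xs), max(xs))
        y = rng.uniform(min(ys), max(ys))
        if point_in_polygon(x, y, poly):
            pts.append((x, y))
    return pts

# Hypothetical county boundary (lon/lat degrees) for illustration only
county = [(-97.4, 27.6), (-96.8, 27.6), (-96.8, 28.2), (-97.4, 28.2)]
sample_points = spatial_random_sample(county, k=5)
```

In the actual design each sampled point would then be reverse-geocoded to a deliverable street address, a step that requires the road network and address locator data described above.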
Sample Selection for Intercept Surveys
NERR and NMS sites pose a particularly challenging sampling context for intercept surveys of resource users. First, in the absence of a record of NERR and NMS site users, there is no readily identifiable sample frame for intercept surveys. Second, the shoreline of NERR sites and the open-water orientation of NMS sites make large portions of the management areas relatively easy to access, meaning there are many locations to which resource users may walk or boat to engage in recreational activities. There are only a few places in NERR and NMS sites where public access is prohibited, such as private property or where natural features impede access. In other words, access to the NERR and NMS management areas is largely open rather than limited to a few points where access-point intercept surveys could be reliably conducted. This makes a stationary access-point survey problematic, because many resource users would be missed if survey stations were set up at only a limited number of visitor access points.
Intercept surveys are ideally suited to locations where users may easily access a body of water from many different points. Additionally, intercept surveys are appropriate when researchers are interested in a particular body of water or a geographically bounded area, such as an island, lake or stretch of river. Consequently, we propose to supplement our paper-based mail-back survey efforts by collecting survey data using an intercept survey design. The intercept survey data collection method employed here will utilize a spatiotemporal frame for sampling.
The sampling design is a two-stage cluster sampling design. In the first stage, a sample of primary sampling units (PSUs), which we define as "access points," will be selected. Next, a sample of visitors (secondary sampling units, SSUs) will be selected from each selected PSU.
There are 32 access points in the Mission-Aransas NERR and each access point has up to 5 activities and up to 10 facilities (henceforth both are collectively deemed “amenities”). Each access point will be ranked according to:
the number of activities at the location (Scale: 1-5)
the number of facilities available at the location (Scale: 1-5)
The access point sites will be clustered by location to lessen the travel burden on the surveyors. Depending on the level of surveyor involvement (i.e., the number of volunteers and students available to conduct the survey), areas containing 3 or fewer sites will have all sites surveyed; areas with more than 3 sites will have at least 3 locations randomly selected from those appearing within the locator map and surveyed.
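The site-selection rule above — survey everything in small clusters, randomly draw at least 3 sites from larger ones — can be sketched as follows. The cluster names and access point IDs are hypothetical.

```python
import random

def select_sites(cluster_sites, min_sites=3, rng=None):
    """Survey all sites in clusters of <= min_sites; otherwise randomly pick min_sites."""
    rng = rng or random.Random(42)
    if len(cluster_sites) <= min_sites:
        return list(cluster_sites)
    return rng.sample(cluster_sites, min_sites)

clusters = {
    "north_bay": ["AP01", "AP02"],                          # <= 3 sites: survey all
    "south_bay": ["AP03", "AP04", "AP05", "AP06", "AP07"],  # > 3: random subset of 3
}
selected = {name: select_sites(sites) for name, sites in clusters.items()}
```

Drawing without replacement via `random.sample` guarantees 3 distinct sites per large cluster.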
We define every day in the 30-day sampling effort as a "visitation day." Each visitation day is 12 hours long (0700-1900) and will be divided into four 3-hour increments: 0700-1000, 1000-1300, 1300-1600, and 1600-1900.
A number of NERR and NMS resource managers have informed us that, generally speaking, visitation at any given access point in protected areas is not overly intense. We therefore propose a census of shore-based stakeholders: after arriving at the survey site, the surveyor will interview every visitor encountered as he or she arrives at the access point. Assuming interviews take approximately 20 minutes to complete, one surveyor could realistically complete 6 to 9 surveys during a 3-hour survey period. Should visitation at a particular access point be extremely high during an assignment — more than 10 visitors present at the time of the survey — the surveyor will systematically sub-sample visitors by selecting every kth visitor instead of completing a census. To determine the interval k, the surveyor will count or estimate the total number of visitors within line of sight at the survey location. For sites with 10 to 20 visitors visible, the surveyor will survey every 2nd visitor; for sites with more than 20 visitors visible, every 3rd visitor, until the survey period has concluded.
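The census-with-fallback rule can be sketched as below. Treating a count of exactly 10 as still a census is an assumption (the text's boundaries overlap slightly at 10), and the arrival order is hypothetical.

```python
def k_interval(visitors_visible):
    """Sub-sampling interval from the count of visitors within line of sight."""
    if visitors_visible <= 10:
        return 1          # census: interview every visitor (boundary at 10 assumed)
    if visitors_visible <= 20:
        return 2          # every 2nd visitor
    return 3              # every 3rd visitor

def select_visitors(arrivals, visitors_visible):
    """Systematic selection of every kth arriving visitor."""
    k = k_interval(visitors_visible)
    return arrivals[k - 1::k]

arrivals = list(range(1, 25))          # visitor arrival order, hypothetical
chosen = select_visitors(arrivals, 15)  # 15 visible -> every 2nd arrival
```

Systematic selection keeps the field procedure simple: the surveyor only has to count arrivals, not maintain a random-number list.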
The SSUs will be weighted so as to increase sampling efficiency and to reflect expected visitation pressure. Anecdotal information from NERR and NMS resource managers indicates that visitation is highest in the afternoon (1300 to 1600) and evening hours (1600 to 1900) for both weekdays and weekend days. Consequently, time of day will be weighted based on anticipated visitation levels, defined as the number of visitors expected by time of day (i.e., morning, mid-day, afternoon, and evening), with 1 representing low or no visitation and 2 representing high visitation, as follows:
Time segment | Weight
0700 to 1000 | 1
1000 to 1300 | 1
1300 to 1600 | 2
1600 to 1900 | 2
Weights for the access points will be based on expected visitation levels. Each access point will be ranked numerically from 1 to 5, with 5 representing high visitor levels and 1 representing low visitor levels. Rankings will be based on the number of known activities at the access point (on a scale from 1 to 5)[8] and the number of facilities available at the access point (on a scale from 1 to 5).[9] For each access point, the factor weights will be averaged, resulting in the final weight for each access point, thus:
Access site ID | Weight
1 | 2
2 | 2
3 | 3
4 | 2
5 | 4
6 | 2
7 | 2
8 | 4
9 | 3
10 | 5
11 | 2
12 | 3
13 | 3
14 | 1
15 | 3
16 | 5
17 | 3
18 | 3
19 | 3
20 | 2
21 | 2
22 | 2
23 | 3
24 | 2
25 | 2
26 | 3
27 | 4
28 | 4
29 | 3
30 | 4
31 | 3
32 | 4
The access points and the time segments will be combined to create 128 access point-time segment units. The weight for each access point will be summed with the corresponding weight for each of the four time segments. Below is an example of the weighting summary for Access Point 1 across the time segments:
Access point ID | Time segment | Access point weight (1 to 5) | Time segment weight (1 to 2) | Total weight for access point-time segment unit (sum; 2 to 7)
1 | 0700 to 1000 | 2 | 1 | 3
1 | 1000 to 1300 | 2 | 1 | 3
1 | 1300 to 1600 | 2 | 2 | 4
1 | 1600 to 1900 | 2 | 2 | 4
SSUs will be selected using unequal probability sampling without replacement. The probability of an access point-time segment unit being selected into the sample is proportional to its weight:

P(ATi) = WATi / Σj=1…128 WATj

Where:
ATi = the ith access point-time segment unit, with a range of AT1 … AT128; and
WATi = the weight of access point-time segment unit ATi.
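As a sketch, the 128 access point-time segment units and their selection probabilities can be constructed directly from the two weight tables above (access point weights from the ranking table, time segment weights of 1 or 2):

```python
time_weights = {"0700-1000": 1, "1000-1300": 1, "1300-1600": 2, "1600-1900": 2}

# Access point weights from the ranking table (site IDs 1-32)
ap_weights = {1: 2, 2: 2, 3: 3, 4: 2, 5: 4, 6: 2, 7: 2, 8: 4,
              9: 3, 10: 5, 11: 2, 12: 3, 13: 3, 14: 1, 15: 3, 16: 5,
              17: 3, 18: 3, 19: 3, 20: 2, 21: 2, 22: 2, 23: 3, 24: 2,
              25: 2, 26: 3, 27: 4, 28: 4, 29: 3, 30: 4, 31: 3, 32: 4}

# 32 access points x 4 time segments = 128 units; unit weight = AP weight + segment weight
units = {(ap, seg): w + tw
         for ap, w in ap_weights.items()
         for seg, tw in time_weights.items()}

# Probability proportional to size: P(AT_i) = W_AT_i / sum_j W_AT_j
total_w = sum(units.values())
probs = {unit: w / total_w for unit, w in units.items()}
```

For example, Access Point 1 in the 1300-1600 segment has unit weight 2 + 2 = 4, matching the example table above, and its selection probability is 4 divided by the sum of all 128 unit weights.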
Sampling Procedure for Intercept Surveys
We will employ a two-stage cluster, unequal probability sampling design for this data collection. For each stratum, as described above, we will first select the day units using simple random sampling without replacement. We will then select the access point-time segment units, weighted for both access point and time segment, using unequal probability sampling (probability proportional to size, PPS) without replacement.
Data Analysis
Attribute profiles for user activity, demographics, and management preferences will be summarized using basic design-based univariate descriptive statistics. Associations between selected independent variables (e.g., age, residence, place of birth, income, employment) and dependent variables (e.g., satisfaction with management activities, main reason for using the NERR/NMS site, frequency of NERR/NMS site use) will be examined using the chi-square test and the Cochran-Mantel-Haenszel test for survey data. The SolVES GIS model used for the spatial analysis of stakeholder responses will generate spatial autocorrelation and average nearest neighbor statistics. Factor analysis will also be used to determine whether any explanatory pattern exists within and among the variables in the survey results.
There are no unusual problems that require specialized sampling procedures.
We anticipate at least a 20%-30% response rate for the mail-back portion of the survey effort and approximately a 90% response rate for the intercept data collection. For both approaches we have developed a short survey so that we will not unduly inconvenience users. To increase awareness of the study and raise the response rate, we plan to work with the NERR and NMS communications staff on outreach that engages local radio, television, newspapers, and newsletter editors in the research effort. The goal of these activities is to inform the public about the need for the data, explain its uses, and describe how the surveys will be conducted. Additionally, we will enlist the help of local site staff, who are familiar with the NERR/NMS site, its culture, and its resource uses, to complete the surveys. Local staff will be trained extensively in appropriate field interviewing etiquette and protocol. Our management collaborators indicate that resource users (especially those in the Tribal areas associated with the Olympic Coast NMS) will be more comfortable with a local person and thus more willing to participate in the survey.
The implementation of the mail surveys is based on Dillman's Tailored Design Method.[10] This approach includes multiple steps and points of contact. First, a postcard will be sent to potential respondents asking them to look out for the survey arriving the next week. The survey mailing will include the questionnaire, a map, a pre-addressed stamped envelope, and a detailed cover letter. The cover letter will explain the project, why a response is important, a statement that all personal information will be kept confidential, and instructions for completing and returning the completed survey (via mail/fax/email). Color will be used on address labels to make the envelopes stand out. Surveys will be tracked using individual identification numbers. A follow-up thank-you postcard will be sent seven to nine days after the questionnaire; it will express appreciation for participating and indicate that if the completed questionnaire has not yet been mailed, it is hoped that it will be returned soon. For the email invitations to take the internet surveys, we will use a number of techniques[11] to increase response, including:
Subject lines on contact emails clearly indicating the purpose of the survey and explicitly avoiding SPAM language in the subject line or body of the message (e.g., a subject line in all capital letters)
Information on how the respondent’s name was obtained, the survey intention, the use of the data, and guarantees of anonymity
Personalized messages
Use of a “.gov” reply email address
Indication of how long the survey takes to complete and the cutoff date
Use of only clean and updated email lists
Scheduled regular reminders and follow-ups.
We have been advised by our local collaborators, however, that questions related to the respondent's income and employment may be objectionable and may therefore meet resistance. To address this possibility, we will develop a detailed "response to user questions" sheet for the surveyor to use when users ask why the information is necessary; the surveyor will be instructed to rely on this set of answers when responding to such questions, in order to increase user comfort. Additionally, for the income question, on the recommendation of our reviewers, we will use an "income response card" so that a user may point to his or her response of choice. This option has been shown to increase respondents' comfort when answering income questions because the respondent does not have to state an income level aloud around other people; it is more private. We have also, by design, placed these questions at the end of the survey to maximize the likelihood that the more significant portions of the survey are completed before the user is engaged on these items. Prior to data analysis, we will assess the refusal rate for each item on the survey and discard from further analysis any item that is not statistically reliable.
To limit non-response in the intercept survey administration, we will suggest to resource managers that older volunteers be recruited as surveyors. We will also suggest that surveyors mention their NOAA or university affiliation and personally appeal to the respondent when introducing the survey. These approaches to minimizing survey non-response are based on the compliance principles of authority and social validation.[12] To further reduce heuristic decision-making by potential survey subjects, details regarding anonymity, the purpose of the survey, and the difficulty of the survey questions will be presented. This approach is intended to allow respondents to take a systematic approach to deciding whether or not to respond to the survey.[13]
In general, non-response analyses will be undertaken to assess the impact of non-response on data quality, per guidance in the OMB Standards and Guidelines for Statistical Surveys. Response rates will be calculated for the collection as a whole as well as for each item on the survey. Where non-response is found to be an issue, we will examine patterns within the data to assess the potential presence of non-response bias.
We will follow OMB-designated best practices in determining unit and item response rates, including calculating both un-weighted and weighted response rates. To calculate un-weighted unit response rates, the following formula will be applied:

RRU = C / [C + R + NC + O + e(U)]

Where:
C = number of completed cases or sufficient partials;
R = number of refused cases;
NC = number of non-contacted sample units known to be eligible;
O = number of eligible sample units not responding for reasons other than refusal;
U = number of sample units of unknown eligibility, not completed; and
e = estimated proportion of sample units of unknown eligibility that are eligible.
To calculate the weighted unit response rates, the following formula will be applied across observations i:

RRW = Σi wiCi / [Σi wi(Ci + Ri + NCi + Oi) + e·Σi wiUi]

Where:
Ci = 1 if the ith case is completed (or is a sufficient partial), and Ci = 0 otherwise;
Ri = 1 if the ith case is a refusal, and Ri = 0 otherwise;
NCi = 1 if the ith case is a non-contacted sample unit known to be eligible, and NCi = 0 otherwise;
Oi = 1 if the ith case is an eligible sample unit not responding for reasons other than refusal, and Oi = 0 otherwise;
Ui = 1 if the ith case is a sample unit of unknown eligibility, and Ui = 0 otherwise;
e = estimated proportion of sample units of unknown eligibility that are eligible; and
wi = the inverse probability of selection for the ith sample unit.
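The un-weighted and weighted response rate computations above can be sketched as follows. The case counts and the eligibility proportion e are hypothetical; the multistage overall rate (the product of stage-level rates, per the formula below) is included for completeness.

```python
def rr_unweighted(C, R, NC, O, U, e):
    """RRU = C / [C + R + NC + O + e*U]."""
    return C / (C + R + NC + O + e * U)

def rr_weighted(cases, e):
    """RRW = sum(w*C) / [sum(w*(C + R + NC + O)) + e*sum(w*U)].

    Each case is a tuple (w, C, R, NC, O, U) with the 0/1 indicators
    defined in the text and w the inverse selection probability."""
    num = sum(w * C for w, C, R, NC, O, U in cases)
    den = (sum(w * (C + R + NC + O) for w, C, R, NC, O, U in cases)
           + e * sum(w * U for w, C, R, NC, O, U in cases))
    return num / den

def rr_overall(stage_rates):
    """Overall multistage rate: the product of the stage-level response rates."""
    prod = 1.0
    for r in stage_rates:
        prod *= r
    return prod

# Hypothetical intercept tallies: 324 completes, 20 refusals, 8 non-contacts,
# 4 other non-respondents, 10 of unknown eligibility with e = 0.5
rr = rr_unweighted(C=324, R=20, NC=8, O=4, U=10, e=0.5)  # ≈ 0.8975
```

With equal weights, `rr_weighted` reduces to the un-weighted formula, which is a useful sanity check on any implementation.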
Because this study involves multiple stages and part of our sample is drawn with probability proportional to size (PPS), we will calculate the overall unit response rate with the following formula:

RRUC = Π(i=1…K) RRUi

Where:
RRUi = the unit-level response rate for the ith stage;
C denotes cross-sectional; and
K = the number of stages.
We will also estimate the bias of the sample respondent mean using the following formula:

B(ȳr) = ȳr − ȳ = (nnr/n)(ȳr − ȳnr)

Where:
ȳ = the mean based on all sample cases;
ȳr = the mean based only on respondent cases;
ȳnr = the mean based only on non-respondent cases;
n = the number of cases in the sample; and
nnr = the number of non-respondent cases.
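The bias estimate can be sketched as below. In practice the non-respondent values are not observed directly and would come from frame variables or non-response follow-up; the values here are hypothetical.

```python
def nonresponse_bias(resp_values, nonresp_values):
    """Estimated bias of the respondent mean: (n_nr / n) * (ybar_r - ybar_nr)."""
    n_r, n_nr = len(resp_values), len(nonresp_values)
    n = n_r + n_nr
    ybar_r = sum(resp_values) / n_r
    ybar_nr = sum(nonresp_values) / n_nr
    return (n_nr / n) * (ybar_r - ybar_nr)

# Hypothetical item (days visited per year): respondents vs. non-respondents
bias = nonresponse_bias([12, 8, 10, 14], [4, 6])
```

The two forms of the formula agree because the full-sample mean ȳ is the weighted average of the respondent and non-respondent means, so ȳr − ȳ collapses to (nnr/n)(ȳr − ȳnr).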
In the analysis of unit nonresponse, we will employ multivariate modeling of response using respondent and non-respondent frame variables to determine whether nonresponse bias exists. We will also compare the respondents to known characteristics of the population from the U.S. Census Bureau in order to investigate possible bias.
The survey was tested by our local collaborators on 5 randomly selected non-commercial resource users within the Mission-Aransas NERR site. The survey performed very well: users understood and answered questions without concern or difficulty. Nevertheless, in response to this test, we intend to develop and deploy with the surveyor an answer sheet with standard responses to anticipated questions about the survey and individual items. Surveyors will be instructed to respond to user questions about the need for and use of the subsistence and income questions using the answer sheet. Additionally, during outreach events, we will provide information to the general public about the need for the data being collected, as well as its intended use.
Kristopher M. Huffman, M.Sc.
Statistician
American College of Surgeons
Chicago, IL.
Susan Lovelace, Ph.D.
Environmental Social Scientist
Human Dimensions Research Program
NOAA NOS NCCOS Hollings Marine Laboratory, JHT, Inc.
331 Ft. Johnson Rd., Charleston, SC 29412
Email: [email protected]
Phone: 843-762-8933

Jarrod Loerzel, M.Sc., M.P.A.
Environmental Social Scientist
Human Dimensions Research Program
NOAA NOS NCCOS Hollings Marine Laboratory
Email: [email protected]
Phone: 843-762-8864

Maria Dillard, M.A.
Environmental Social Scientist
Human Dimensions Research Program
NOAA NOS NCCOS Hollings Marine Laboratory
Email: [email protected]
Phone: 843-762-8929
Susan Lovelace and Jarrod Loerzel will supervise data collection. Data analysis will be completed by Jarrod Loerzel, Susan Lovelace, and Maria Dillard as well as the aforementioned NERR and NMS staff.
[1] Dillman, Don A., Jolene D. Smyth, and Leah Melani Christian. (2009). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 3rd ed. Hoboken, NJ: John Wiley & Sons, Inc.
[2] Ibid.
[3] South Carolina Department of Natural Resources. (2011). Ashepoo-Combahee-Edisto (ACE) Basin National Estuarine Research Reserve Management Plan 2011-2016. Charleston, SC: South Carolina Department of Natural Resources.
[4] Guana Tolomato Matanzas National Estuarine Research Reserve. (2009). Guana Tolomato Matanzas National Estuarine Research Reserve Management Plan May 2009 - April 2014. Ponte Vedra Beach, FL: Florida Department of Environmental Protection - Coastal and Aquatic Managed Areas.
[5] Gaddis, Aimee, Hurley, Dorset, Vallaster, Brooke, VanParreren, Suzanne, Sullivan, Buddy, Mason, Ann, & Howell, Lyndsey. (2008). Sapelo Island National Estuarine Research Reserve Management Plan 2008-2013. Brunswick, GA: Reagin Printing Company.
[6] Evans, Anne, Madden, Kiersten, & Morehead-Palmer, Sally (Eds.). (2012). The Ecology and Sociology of the Mission-Aransas Estuary: An Estuarine and Watershed Profile. Port Aransas, TX: University of Texas.
[7] NOAA/NCCOS. (2012). Sampling Tool ArcGIS Software Plug-in (Version 10): NOAA/NCCOS.
[8] Activity ranking: 5 = 4+ activities; 4 = 3 activities; 3 = 2 activities; 2 = 1 activity; 1 = 0 activities.
[9] Facility ranking: 5 = 9+ facilities; 4 = 7-8 facilities; 3 = 5-6 facilities; 2 = 3-4 facilities; 1 = 1-2 facilities.
[10] Dillman, Don A., Jolene D. Smyth, and Leah Melani Christian. (2009). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 3rd ed. Hoboken, NJ: John Wiley & Sons, Inc.
[11] Ibid.
[12] Dijkstra, Wil, & Smit, Johannes H. (2002). Persuading reluctant recipients in telephone surveys. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey Nonresponse. New York, NY: John Wiley & Sons, Inc.
[13] Ibid.