SUPPORTING STATEMENT
SOCIO-ECONOMIC SURVEYS OF VESSEL OWNERS AND CREW IN NEW
ENGLAND AND MID-ATLANTIC FISHERIES
OMB CONTROL NO. 0648-XXXX
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Describe (including a numerical estimate) the potential respondent universe and any
sampling or other respondent selection method to be used. Data on the number of entities
(e.g. establishments, State and local governmental units, households, or persons) in the
universe and the corresponding sample are to be provided in tabular form. The tabulation
must also include expected response rates for the collection as a whole. If the collection has
been conducted before, provide the actual response rate achieved.
Target and Sampling Populations
Table 1 provides the definitions of the target and sampling populations for each survey.

Table 1 - Target and Sampling Population Definitions

Target population (the population that the survey effort is interested in collecting data about):
• Owner Survey: Individuals or entities that own fishing vessels operating in the Northeast or Mid-Atlantic states.
• Crew Survey: Individuals who work as crew on commercial fishing vessels operating in the Northeast or Mid-Atlantic states.

Sampling population (the set of individuals from which the sample units are drawn):
• Owner Survey: Individuals or entities whose names are listed as vessel owners.
• Crew Survey: Individual crew members who can be encountered in the public areas of docks.
Population and Sample Sizes
Both surveys will be stratified by fishery. The set of fisheries that will be used to stratify the sample is provided in Table 2, which also provides estimates of the populations, first-year sample sizes, and expected response rates for both surveys. Details on how sample sizes were estimated are provided under Part B, Question 2 below. The total sample size for the owners survey is targeted to be 769 in the first year, and the sample size for the crew survey is targeted to be 1,330 in the first year. As noted above, SSB will collect the full sample size in the first year and then (approximately) half of the first-year sample size in each of the second and third years. The sample selected each year, however, will be independent of samples collected in other years (i.e., SSB will not be collecting data from the same individuals over time unless those individuals are randomly selected in different years). In the second and third years, SSB will collect data from one half of the fisheries in each year. The per-year sample sizes and the annualized sample size are presented in Table 3.
Table 2 - Populations and First Year Sample Sizes, By Fishery, and Expected Response Rate for Owner and Crew Surveys

Fishery | Owners Population [a] | Owners Sample Size [b] | Crew Population [c] | Crew Sample Size [b]
Black Sea Bass | 60 | 34 | 506 | 66
Herring and mackerel | 25 | 19 | 509 | 66
Lobster | 506 | 66 | 4,229 | 75
Monkfish | 82 | 40 | 917 | 70
Multispecies, large mesh (common/other) | 55 | 32 | 487 | 65
Multispecies, large mesh (sector) | 243 | 58 | 3,045 | 75
Multispecies, small mesh | 20 | 16 | 281 | 60
Red crab | 5 | 5 | 143 | 50
Scallop, general category IFQ | 151 | 51 | 2,180 | 75
Scallop, general category non-IFQ | 148 | 50 | 3,875 | 75
Scallop, limited access | 193 | 55 | 5,114 | 75
Scup | 23 | 18 | 219 | 56
Skate | 23 | 18 | 290 | 60
Spiny dogfish | 45 | 29 | 341 | 62
Squid, Illex | 10 | 9 | 273 | 59
Squid, Loligo | 42 | 27 | 534 | 66
Summer Flounder | 178 | 53 | 1,563 | 75
Surf clam/ocean quahog | 64 | 35 | 1,084 | 71
Tilefish | 15 | 13 | 132 | 48
Inactive (common/other) | 1,245 | 42 | - | -
Inactive (sector) | 266 | 37 | - | -
Non federally managed fishery (common/other) | 427 | 39 | 3,869 | 42
Non federally managed fishery (sector) | 50 | 23 | 409 | 39
Totals | 3,876 | 769 | 30,000 | 1,330

Expected Response Rate: Owners 70%; Crew 90%
[a] The population for owners reflects the number of vessels in each fishery. Since owners can own more than one
vessel, this number overestimates the number of owners in the Northeast. Work is currently underway at the
Northeast Fisheries Science Center to develop definitive linkages between vessels and owners. Data from that effort
should be available to use to develop a sampling frame for the first year implementation of this survey. Vessels were
placed in a fishery based on revenues in 2010. If a vessel was inactive in 2010, 2009 revenues were used and if
inactive in 2009 also, 2008 revenues were used. If a vessel was inactive in 2008-2010, then it was placed in an
“inactive” category.
[b] Details on the calculation of sample size can be found under Section B, Question 2 below.
[c] The population of crew for each fishery was estimated by distributing an estimated 30,000 crew in the Northeast
and Mid-Atlantic states across the fisheries based on information on the number of crew required for each vessel.
Attachment A provides details on this estimate.
Table 3 - Total Sample Sizes per Year and Annualized Sample Size

Survey Year | Owners Survey | Crew Survey
First year | 769 | 1,330
Second year | 385 | 665
Third year | 385 | 665
Annualized [a] | 513 | 887

[a] Calculated by summing the sample sizes over the three years and dividing by three.
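For example, the annualized owner sample size is (769 + 385 + 385) / 3 = 513, and the annualized crew sample size is (1,330 + 665 + 665) / 3 ≈ 887.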
Response Rates
SSB expects the response rate for the owners survey to be 70 percent and the response rate for the crew survey to be close to 90 percent. For the owners survey, SSB will use Don Dillman’s Tailored Design Method (TDM) for mail surveys (Dillman, 1999). The TDM approach involves multiple points of contact with potential respondents to maximize response rates. SSB’s estimate of 70 percent is based on work done by its contractor, which has obtained response rates of 70 percent or higher for mail surveys.
SSB’s estimate for the crew survey is based on previous work conducted by Richard Pollnac in which a 90 percent response rate was achieved in an intercept survey of crew in New England. 1

1 This response rate is based on a project entitled “Job Satisfaction, Well-being and Change in New England Fishing Communities” coordinated by Richard Pollnac at the University of Rhode Island.
Sample Selection
To select the sample of owners, SSB will use a systematic sampling approach. Each stratum will
be sorted by the owner’s listed state. Next, SSB will determine the sampling interval by dividing
the total number in each stratum by the sample size to be selected. For example, with a sample of
20 respondents and a population of N owners, the sampling interval would be k = N/20. SSB will
then select a random number between 1 and k which becomes the starting point for the sampling
process. SSB would then select every kth potential respondent beginning at the randomly selected
starting point in the sorted list. For example, if the random number selected as the start point was
3, then SSB would select respondent numbers 3, 3 + k, 3 + 2k, etc. Sorting by state will allow for proportional representation of states within the sample.
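A minimal sketch of this systematic selection step follows (illustrative only; the record layout, state codes, and handling of non-integer intervals are assumptions, not SSB's implementation):

```python
import random

def systematic_sample(stratum, n):
    """Select n units from a stratum using systematic sampling.

    stratum: list of owner records, assumed to be pre-sorted by state so that
             states end up roughly proportionally represented.
    n:       target sample size for the stratum.
    """
    N = len(stratum)
    k = max(1, N // n)              # sampling interval, k = N / n
    start = random.randint(1, k)    # random starting point between 1 and k
    # Select respondent numbers start, start + k, start + 2k, ... (1-based),
    # truncated to the first n selections.
    picks = list(range(start, N + 1, k))[:n]
    return [stratum[i - 1] for i in picks]

# Illustration with a hypothetical frame of 100 owners and a sample of 20.
frame = sorted(({"id": i, "state": random.choice(["ME", "MA", "RI", "NJ"])}
                for i in range(100)), key=lambda o: o["state"])
print(len(systematic_sample(frame, 20)))  # 20
```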
Respondents for the crew survey will be selected using a cluster sample design. After
stratification, the first selection process will involve randomly selecting ports. To ensure that
“active” ports are selected, SSB will select using a probability proportional to size (PPS)
approach. Specifically, under a PPS approach a port’s probability of being selected into the
sample is related to the “size” of the port with larger ports being more likely to be selected into
the sample. The PPS approach is necessary to ensure that selected ports are more active and thus,
more likely to result in completed crew surveys. For this study, the size of the port should be
measured by some factor that is correlated with the availability of crew at the port. NMFS is
currently reviewing available data to determine the best factor to use. One limiting concern is that the factor chosen should not itself be correlated with particular fisheries (e.g., a factor correlated with fisheries would lead to over-selection of ports that concentrate on a specific set of fisheries). Once ports are selected, SSB will place interviewers at the ports and crew will be
recruited to take the survey as they are identified.
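A minimal sketch of the PPS port-selection step follows (illustrative only; the port names and the size measure are hypothetical placeholders, since the actual measure is still under review):

```python
import random

def pps_select_ports(ports, sizes, n_ports):
    """Select n_ports without replacement, probability proportional to size.

    ports:   list of port names.
    sizes:   size measure per port (some factor correlated with crew
             availability; the values below are placeholders).
    n_ports: number of ports to select.
    """
    remaining = list(zip(ports, sizes))
    selected = []
    for _ in range(n_ports):
        total = sum(s for _, s in remaining)
        r = random.random() * total          # point on the cumulative size scale
        cumulative = 0.0
        for i, (port, size) in enumerate(remaining):
            cumulative += size
            if r <= cumulative:
                selected.append(port)        # larger ports are hit more often
                del remaining[i]
                break
    return selected

# Hypothetical example: larger ports are more likely to be drawn.
ports = ["New Bedford", "Gloucester", "Point Judith", "Cape May"]
sizes = [120, 80, 45, 30]
print(pps_select_ports(ports, sizes, 2))
```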
The number of ports selected in the crew survey will depend on the implementation costs and the distribution of fisheries by port:
• Higher implementation costs will lead to fewer ports being selected. One key aspect of implementation is the rate at which surveys would be completed (e.g., number of completes per day per interviewer on site). As noted in Section B, question 4, SSB is conducting a small-scale pilot to better assess implementation costs, logistics, and completion rates.
• A uniform distribution of fisheries across ports will lead to fewer ports being selected. In order to collect data from all fisheries identified in Table 2 above, SSB will need to visit ports that represent all fisheries. If fisheries tend to be uniformly distributed across ports (i.e., most ports involve most fisheries), then fewer ports would need to be visited. However, if fisheries tend to be concentrated by port (i.e., some ports concentrate on some fisheries while other ports concentrate on other fisheries), then more ports will need to be visited.
SSB expects to select between 10 and 20 ports to visit at various times and days as part of this project, with the exact number to be determined as data on fishery distribution by port are examined and following the small-scale pilot discussed under Part B, Question 4.
2. Describe the procedures for the collection, including: the statistical methodology for
stratification and sample selection; the estimation procedure; the degree of accuracy
needed for the purpose described in the justification; any unusual problems requiring
specialized sampling procedures; and any use of periodic (less frequent than annual) data
collection cycles to reduce burden.
Sample Size and Accuracy
SSB has selected an unadjusted sample size of 75 units (owners or crew) per stratum for most
fisheries. For some strata where less precise information is needed, SSB relaxed the accuracy
requirements and needs only 42 units for each of those strata. Each of these per stratum values
was adjusted using the finite population correction. The process for developing these per stratum
values is discussed, along with the implications for accuracy, in the remainder of this section.
In setting sample size, three statistical criteria need to be considered:
• Confidence represents the confidence level around estimates derived from the sample. Confidence is generally set at 95 or 90 percent in socio-economic studies. For deriving sample size estimates, SSB used 90 percent.
• The power of a statistical test is the probability of correctly rejecting a false hypothesis. In more practical terms, it is the probability of detecting a change or difference in some variable in a sample when that change or difference has actually occurred in the population. SSB used 80 percent power to define a sample size. Setting power at 80 percent is rather strong and will increase sample size relative to standard hypothesis testing. However, standard hypothesis testing sets power at 50 percent by default. Thus, under a standard hypothesis test, there is only a 50-50 chance of detecting effects within a sample that have actually occurred in the population. 2
• Precision (accuracy) concerns the amount of sampling error that one is willing to accept. With very large samples, one can be fairly certain that estimates derived from the sample are close to the population values. The key questions from this survey are in terms of five point scales. For the five point scale questions, each point on the scale is assigned a value of one to five to transform the scale into a numeric value. The five point scale questions are almost all part of groups of questions that together form an index. The indices are the key pieces of information with respect to the five point scales and thus, precision should be set in terms of the indices. Each index varies in terms of the number of components (i.e., the five point scale questions that comprise it). To account for this, SSB performed sample size calculations for detecting changes in averaged index values over time. 3
Another consideration is the type of comparisons that are being made. SSB will be tracking
trends within a fishery by comparing one data collection to others. This has two implications.
First, SSB used sample size formulas that reflect comparing one sample to another. Second, SSB
set levels of precision at the fishery level.
In settling on a sample size, SSB considered a series of tabulations that provided estimated sample sizes for various levels of accuracy. The tabulations were based on Jacob Cohen’s (1988) power analysis calculations of the sample size needed to detect a difference in mean values between two samples. 4 The formula used to calculate the potential sample size was derived from Cohen’s book and expresses n0, the initial sample size, as a function of 1,237 (a value derived from Table 2.4 in Cohen’s book) and d (the difference between the two means divided by the standard deviation). In order to calculate sample sizes for the index questions it is necessary to have an estimate of the standard deviation for the indices to use in the value d. SSB was provided with data from researchers at East Carolina University for similar five point indices. These data are presented in Attachment B. The data in Attachment B reflect two indices, both comprised of nine questions. When the indices are divided by the number of components (nine in each case), the standard deviations for the two indices are 0.646 and 0.7. For calculating sample sizes, SSB used a standard deviation of 0.9 to be conservative.
Table 4 provides sample sizes for five levels of accuracy. Accuracy is defined as the difference in the averaged index value between two implementations of the survey (e.g., between year one and year two). For example, to have an 80 percent chance of detecting a 0.2 point difference on a five point scale between two implementations of the survey, assuming the change actually occurred in the population, would require selecting 252 units from each stratum. 5

2 Using power above 50 percent necessitates the use of power analysis to set sample sizes (Cohen, 1988).
3 The averaged index value is the index value divided by the number of questions in the index. For example, an index comprised of eight five point scale questions can take on values that range from 8 to 40 for each respondent. Dividing by eight provides an average value for this index and transforms the index back to a range of 1 to 5.
4 Jacob Cohen, 1988. Statistical Power Analysis for the Behavioral Sciences, Lawrence Erlbaum Associates, Chapter 2.
5 As noted above, these values are based on the above formula using 0.9 as the standard deviation in the value d.
Table 4 - Per Stratum Sample Sizes for Detecting Various Changes in a Five Point Scale

Difference in mean index value | Sample size per stratum
0.1 | 1,003
0.2 | 252
0.3 | 113
0.4 | 64
0.5 | 42
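The values in Table 4 can be reproduced from the description above. The sketch below is illustrative only: it assumes a form consistent with Cohen's (1988) approximation, the tabled constant of 1,237, the 0.9 standard deviation, and rounding up to the next whole unit.

```python
import math

N_TABLE = 1237   # value derived from Table 2.4 in Cohen (1988)
SD = 0.9         # conservative standard deviation assumed for the indices

def per_stratum_n(mean_difference, sd=SD):
    """Per stratum sample size for detecting a given difference in the
    averaged index value (90 percent confidence, 80 percent power)."""
    d = mean_difference / sd                    # standardized effect size
    return math.ceil(N_TABLE / (100 * d ** 2) + 1)

for diff in (0.1, 0.2, 0.3, 0.4, 0.5):
    print(diff, per_stratum_n(diff))            # 1003, 252, 113, 64, 42

# Inverting the same relationship at n0 = 75 gives the detectable difference
# quoted in the text, roughly 0.36-0.37 points on the five point scale.
print(round(SD * math.sqrt(N_TABLE / (100 * (75 - 1))), 2))   # ~0.37
```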
In selecting 75 units per stratum, SSB considered both cost (the number of sample units needed) and the accuracy that could be obtained. In short, SSB considered a value of 75 units an acceptable balance between cost and accuracy. Inverting the above formula for n0 = 75 resulted in an estimated precision of 0.36-0.37 units on a five point scale. Thus, with these sample sizes, there is an 80 percent chance that a change of 0.36-0.37 units on a five point scale in the sample (for a specific fishery) will be detected as a statistically significant change if the change actually occurred in the population.
As noted above, however, less precise information is needed for some strata. These include owners in the “inactive” stratum and owners and crew in the non-Federally managed fisheries. SSB determined that data were needed from these categories, but not at the level of precision needed for other fisheries. For these strata, SSB determined that a precision of 0.5 units on the five point scale was sufficient.
Finally, SSB adjusted these sample size estimates using the finite population correction (FPC), which reduces the required sample size in a stratum as a function of the stratum population N. The FPC was applied to any stratum where the sample size exceeded five percent of the population. The FPC-adjusted sample sizes for each stratum appear in Table 2.
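A standard form of the adjustment, applied to the unadjusted per stratum sample size n0, is n = n0 / (1 + n0 / N). This expression, rounded up to the next whole unit, reproduces the adjusted sample sizes shown in Table 2, although the exact formula SSB used is an assumption here.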
Unusual Problems
No unusual problems are expected to be encountered.
Use of Periodic Collection Cycle
As noted in Section A, question 5, the survey will involve collecting data from all fisheries in the
first year and then collecting data from half of the fisheries every other year. Thus, all fisheries
will have data collected at least every other year following the first. SSB may collect annual data
from fisheries where more frequent data would be needed to support policy decisions (e.g.,
fisheries that may be considered “priority”). However, for most fisheries, data would only be
collected every other year following the first year.
3. Describe the methods used to maximize response rates and to deal with nonresponse.
The accuracy and reliability of the information collected must be shown to be adequate for
the intended uses. For collections based on sampling, a special justification must be
provided if they will not yield "reliable" data that can be generalized to the universe
studied.
Both surveys
For both surveys, SSB has employed the following practices to maximize response rate:
• Survey length—SSB has limited the length of the survey to ensure it can be completed in a reasonable amount of time.
• Best-practices design—SSB has employed an expert survey firm that employs best practices in survey design. These best practices take into account question sequencing, wording, and graphic elements on the survey.
Owners survey
To maximize response rate in the owners survey, SSB will use Dillman’s TDM. The TDM in a mail survey context involves multiple points of contact with potential respondents to improve response. The following procedure will be used in the owners survey:
• Pre-notification letter—Each owner selected as part of the sample will be sent a pre-notification letter to inform them of the upcoming survey. The letter will explain the need for the survey and how responding to the survey will provide valuable information to NOAA.
• Survey mail-out—One week following the pre-notification letter, each owner selected as part of the sample will receive a version of the survey instrument and a cover letter. The cover letter will explain the importance of the survey and how responding to the survey will provide valuable information to NOAA. The mail-out package will also contain a self-addressed stamped envelope (SASE) for returning the survey to SSB.
• Reminder postcard—Approximately 1-2 weeks following the first survey mail-out, a reminder postcard will be sent to those that have not responded. The postcard will provide contact information (phone and email) respondents can use to get a replacement copy of the instrument if needed.
• Replacement survey mail-out—Approximately two weeks following the reminder postcard, SSB will mail out a second version of the survey (with a SASE) to those that have not responded. The survey will arrive with a cover letter explaining that a second version is being provided to ensure the survey was not lost and once again stressing the importance of responding.
Following these steps, SSB will determine the number of replacements that need to be selected from the sample. The replacements would substitute for those that have not responded within two weeks of receiving the replacement survey mail-out.
In addition to the pre-notification letter, SSB also plans to perform outreach regarding the
survey. This will include advertising the survey in local publications (e.g., Commercial Fishing
News) and writing a guest editorial in Commercial Fishing News that describes the value of
responding to the survey.
Dealing with Nonresponse in the owners survey
As noted under Section B, Question 1, SSB expects that the owners survey will have a response
rate of 70 percent. In order to ensure that the resulting data are not biased due to nonresponse,
SSB will perform a nonresponse analysis. The analysis will include comparing the data collected
through the survey to previously collected data. SSB will compare the data collected under this
effort to three sets of the available data:
• SSB collected data on owners and crew in 2000 that included demographic information on the owners. 6 That survey resulted in a response rate of 78 percent. SSB can use those data to assess the extent to which the sample that responded was significantly different from those that responded to the 2000 survey effort.
• The sampling frame will be constructed from data maintained by NOAA’s Northeast Fisheries Science Center. These data have information on boat size, permits, and home ports. SSB can use these data to compare to the data that are collected through the survey to assess whether the sample that responded was significantly different from those that did not respond.
• The Gulf of Maine Research Institute (GMRI) has performed a number of surveys and other research projects that have involved collecting data on socioeconomic aspects of fisheries management. 7 Some of these surveys contain information on demographics related to owners that SSB can use to assess whether the sample that responded was significantly different from the sample that responded to the GMRI research projects.
6 http://www.nefsc.noaa.gov/nefsc/publications/tm/tm164/tm164.pdf.
7 http://gmri.org/community/display.asp?a=5&b=16&c=171.
Crew survey
The crew survey will be implemented as an intercept approach in which interviewers will intercept crew at the docks. A random intercept survey is being used to maximize response rates and is a method used for studies of hard-to-find individuals (Miller et al., 1997) such as crew, who may not have a permanent address or phone number or may live aboard the vessel on which they work (Kitner, 2006). A study similar to this one achieved a 90 percent response rate from 350 fishermen in New England in 2009 and 2010. 8

8 See footnote 1 above.
To improve response rates, surveys will be conducted in-person when possible. Face-to-face interviews are an effective method for the collection of information from people such as illiterate individuals who may not be able to participate using other methods (Bernard, 2006:256). Face-to-face interviews also make it possible to probe for more in-depth answers and clarify respondent questions (Bernard, 2006:256). In addition, the individuals participating in the research have the opportunity to communicate with the researcher and provide additional information that is useful to the overall objectives of the study. If more than one crew member is available and willing to take the survey, then the interviewer may hand out the survey with a clipboard and pen and wait for the respondents to take the survey, answering questions if needed.
Prior to the implementation of the survey, interviewers will explain that the survey is anonymous, participation is voluntary, and the interview can be stopped at any point. It will also be explained that participants can skip questions they do not want to answer.
4. Describe any tests of procedures or methods to be undertaken. Tests are encouraged as
effective means to refine collections, but if ten or more test respondents are involved OMB
must give prior approval.
SSB has a contract in place to perform small-scale pilots (fewer than nine respondents each) of the methods and instruments involved in this data collection. Table 5 provides a summary of these potential pilots.

Table 5 - Pilots Being Conducted to Assess Methods and Survey Instrument

Pilot: Crew survey – interviews with crew at ports
Description: The crew survey will be implemented as an intercept survey where interviewers travel to randomly selected ports to find crew. Thus, there is a need to understand how well this approach would work and to develop information that can be used in estimating the cost of collecting data in this manner. This pilot will involve collecting up to nine responses from crew at two ports selected for convenience. SSB’s subcontractor will go through the survey with each respondent and then ask a set of follow-on questions.
Objectives:
• Determine the time it would take to complete the survey using the intercept, read-out approach.
• Assess how well the intercept approach may work for identifying and completing surveys, including identifying any best practices or lessons learned.
• Assess how well the survey questions will work in the field.
• Develop information that can be used to estimate costs for full implementation, including:
  o Completion rate per day
  o Time to complete each survey

Pilot: Crew survey – interviews with port agents and harbor commissioners
Description: As noted above, the crew survey will be an intercept approach. Thus, there is a need to understand the most effective way to implement this approach. To increase our understanding, SSB’s subcontractor will perform a series of interviews with harbor commissioners and port agents. These interviews will focus on implementation issues related to the crew survey.
Objectives:
• Determine the best times of the year and day to perform an intercept survey of crew at ports.
• Explore possible implementation issues that may arise in (1) getting access to ports and (2) identifying and recruiting crew to take part.

Pilot: Owners survey
Description: The owner survey will be a mail survey. The owner pilot will be used to assess the survey questions and the extent to which anonymity will be an issue for response rates. To stay within PRA requirements, SSB’s subcontractor will interview nine or fewer ship owners.
Objectives:
• Assess how well the survey questions will work when implemented by discussing the questions with owners.
• Assess whether a lack of anonymity to NMFS would lead to reduced response from ship owners.
• Assess how well a mail survey would work among owners, including whether (1) the appropriate owner to answer the questions (i.e., a decision maker) would be reached by a mail survey and (2) owners would be available and willing to answer a mail survey.
5. Provide the name and telephone number of individuals consulted on the statistical
aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other
person(s) who will actually collect and/or analyze the information for the agency.
SSB has contracted with the following to develop and review the survey. SSB has made no
determination at this point on who would be involved in collecting and analyzing the data
outside of SSB staff.
Name and Affiliation | Phone | Email
Lou Nadeau, Eastern Research Group, Inc. | 781-674-7316 | [email protected]
David Loomis, East Carolina University | 252-737-4263 | [email protected]
Richard Pollnac, University of Rhode Island | 401-874-5107 | [email protected]
REFERENCES
Bernard, H. Russell. 2006. Research Methods in Anthropology: Qualitative and Quantitative Approaches. Altamira Press, New York.
Cohen, Jacob. 1988. Statistical Power Analysis for the Behavioral Sciences, 2nd Edition. Lawrence Erlbaum Associates, Hillsdale, N.J.
Dillman, Don. 1999. Mail and Internet Surveys: The Tailored Design Method. John Wiley and Sons, New York.
Kitner, Kathi. 2006. Beeliners, Pinkies, and Kitties: Mobility and Marginalization in the South Atlantic Snapper Grouper Fishery. Human Organization 65(3): 294-306.
Miller, K.W., L.B. Wilder, F.A. Stillman, and D.M. Becker. 1997. The feasibility of a street intercept survey method in an African-American community. American Journal of Public Health 87(4): 655-658.
Pollnac, Richard B., Susan Abbott-Jamieson, Courtland Smith, Marc L. Miller, Patricia M. Clay, and Bryan Oles. 2006[2008]. Toward a Model for Fisheries Social Impact Assessment. Marine Fisheries Review 68(1-4): 1-18.
Attachment A
Estimated Crew Population
The total crew population in the Northeast Region is estimated to be 30,000. This number is derived from previous work that SSB has done with IMPLAN (Minnesota IMPLAN Group, 2008 IMPLAN System (data and software), 1725 Tower Drive West, Suite 140, Stillwater, MN 55082, www.implan.com). Although a total number is available, the number per fishery is not. SSB used data on vessels and crew per vessel to develop a set of percentages to allocate the 30,000 total. First, SSB used data on the value of fish caught to assign vessels to fisheries. A vessel was assigned to a fishery based on its value of total catch in 2010. If the vessel was inactive in 2010, then 2009 data were used, and if inactive in 2009 also, then 2008 data were used. A vessel inactive from 2008-2010 was placed in the inactive category. These per-fishery vessel numbers appear in Table A-1. Next, a total crew number for each fishery was calculated from the data available to SSB on the crew required per vessel; these figures reflect average crew sizes for the different fisheries and cannot be used directly to estimate the total crew population for a fishery, but they can be used to calculate a percentage for each fishery. These data are also in Table A-1. The crew numbers were then converted to a percent distribution (Table A-1), and the 30,000 total was allocated across the fisheries using this percent distribution (last column of Table A-1).
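A minimal sketch of this allocation step, using a subset of the Table A-1 figures (the exact rounding used to make the allocated populations sum to 30,000 is not stated, so the results here land within a unit or two of the published values):

```python
TOTAL_CREW = 30000   # regional crew estimate derived from IMPLAN
GRAND_TOTAL = 6832   # total "crew number for use in allocation" in Table A-1

# "Total crew number for use in allocation" for a few fisheries (Table A-1).
crew_for_allocation = {
    "Black Sea Bass": 115,
    "Lobster": 963,
    "Scallop, limited access": 1165,
}

# Each fishery's share of the allocation total is applied to the 30,000 figure.
# Rounding in the published figures means results differ slightly from the
# table (e.g., ~505 here vs. 506 for Black Sea Bass; ~4,229 for Lobster).
for fishery, crew in crew_for_allocation.items():
    population = round(crew / GRAND_TOTAL * TOTAL_CREW)
    print(fishery, population)
```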
Table A-1. Calculation of Crew Population By Fishery

Fishery | Number of Vessels | Total crew number for use in allocation | Crew, percent of total | Crew population (30,000 total allocated by percentages)
Black Sea Bass | 60 | 115 | 1.7% | 506
Herring and mackerel | 25 | 116 | 1.7% | 509
Lobster | 506 | 963 | 14.1% | 4,229
Monkfish | 82 | 209 | 3.1% | 917
Multispecies, large mesh (common/other) | 55 | 111 | 1.6% | 487
Multispecies, large mesh (sector) | 243 | 693 | 10.2% | 3,045
Multispecies, small mesh | 20 | 64 | 0.9% | 281
Red crab | 5 | 32 | 0.5% | 143
Scallop, general category IFQ | 151 | 496 | 7.3% | 2,180
Scallop, general category non-IFQ | 148 | 882 | 12.9% | 3,875
Scallop, limited access | 193 | 1,165 | 17.0% | 5,114
Scup | 23 | 50 | 0.7% | 219
Skate | 23 | 66 | 1.0% | 290
Spiny dogfish | 45 | 78 | 1.1% | 341
Squid, Illex | 10 | 62 | 0.9% | 273
Squid, Loligo | 42 | 122 | 1.8% | 534
Summer Flounder | 178 | 356 | 5.2% | 1,563
Surf clam/ocean quahog | 64 | 247 | 3.6% | 1,084
Tilefish | 15 | 30 | 0.4% | 132
Inactive (common/other) | 1,245 | Not available | - | -
Inactive (sector) | 266 | Not available | - | -
Non federally managed fishery (common/other) | 427 | 881 | 12.9% | 3,869
Non federally managed fishery (sector) | 50 | 93 | 1.4% | 409
Total | 3,876 | 6,832 | 100.0% | 30,000
Attachment B
Data Provided by East Carolina University For
Estimating Variance of a Five-Point Scale
Background
The following presents two example indexes based on real data. The subjects in the study were
SCUBA divers and snorkelers. The first index is based on specialization theory, and the second
is based on mediated interaction (which is basically the extent to which a person makes use of
various sources of information). Each scale consists of nine items, each item with five possible
responses (5-point Likert type scale). The individual items, means and standard deviations (SDs)
are provided in each table. The nine individual items are then summed into a cumulative index
ranging from 9 to 45, with the mean and SD of that index provided in each table. Finally the
cumulative index for each is segmented into five levels (the final index; i.e., converted back to a
five point scale), with the mean and SD of that index provided.
Many of the variances for the individual items are below 1.2; the median variance among this set of items is 1.22, so 1.22 may be a good estimate of variance for this study, although there are several items that exceed 1.2. Additionally, if no items were correlated with one another, the variance of each index would simply be the sum of the item variances; however, for each index below, the sum of the individual item variances is 3-4 times lower than the observed index variance. To adjust for this in this study, ERG multiplied the assumed variance of 1.2 by eight (our assumed index item size) and then inflated the result by a factor of 3.5 to adjust for inter-item correlations.
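As a rough check on this adjustment, 1.2 × 8 × 3.5 = 33.6, which is of the same order as the observed cumulative index variances of 33.84 and 39.65 below; the corresponding standard deviation for an eight-item averaged index would be about √33.6 / 8 ≈ 0.72, so the 0.9 standard deviation assumed in Part B, Question 2 is conservative relative to these data.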
SCUBA Diver Information Index:

Information Items | N | Min. | Max. | Mean | Std. Dev. | Var.
Item 1 | 965 | 1 | 5 | 4 | 1.114 | 1.24
Item 2 | 955 | 1 | 5 | 3.19 | 1.272 | 1.62
Item 3 | 954 | 1 | 5 | 1.98 | 1.099 | 1.21
Item 4 | 951 | 1 | 5 | 2.15 | 1.145 | 1.31
Item 5 | 940 | 1 | 5 | 1.7 | 0.966 | 0.93
Item 6 | 955 | 1 | 5 | 3.9 | 1.121 | 1.26
Item 7 | 943 | 1 | 5 | 1.73 | 1.194 | 1.43
Item 8 | 955 | 1 | 5 | 1.73 | 1.049 | 1.10
Item 9 | 953 | 1 | 5 | 1.39 | 0.801 | 0.64
Cumulative Index | 917 | 9 | 45 | 21.71 | 5.817 | 33.84
Final Index (five subgroups) | 917 | 1 | 5 | 2.41 | 0.646 | 0.42
Snorkeler Information Index:

Information Items | N | Min. | Max. | Mean | Std. Dev. | Var.
Item 1 | 598 | 1 | 5 | 3.23 | 1.451 | 2.11
Item 2 | 592 | 1 | 5 | 1.73 | 1.082 | 1.17
Item 3 | 595 | 1 | 5 | 1.84 | 1.133 | 1.28
Item 4 | 589 | 1 | 5 | 1.87 | 1.172 | 1.37
Item 5 | 582 | 1 | 5 | 1.73 | 1.054 | 1.11
Item 6 | 586 | 1 | 5 | 2.74 | 1.439 | 2.07
Item 7 | 585 | 1 | 5 | 1.16 | 0.548 | 0.30
Item 8 | 593 | 1 | 5 | 1.6 | 1.005 | 1.01
Item 9 | 592 | 1 | 5 | 1.34 | 0.777 | 0.60
Cumulative Index | 561 | 9 | 45 | 17.09 | 6.297 | 39.65
Final Index (five subgroups) | 561 | 1 | 5 | 1.9 | 0.7 | 0.49