
SUPPORTING STATEMENT

U.S. Department of Commerce

National Oceanic & Atmospheric Administration

Pacific Coast Groundfish Fishery Rationalization Social Study

OMB Control No. 0648-0606


SUPPORTING STATEMENT PART B


Agencies are instructed to complete Supporting Statement Part B if they are using statistical methods, such as sampling, imputation, or other statistical estimation techniques; most research collections or program evaluations should also complete Part B. If an agency is planning to conduct a sample survey as part of its information collection, Part B of the ICR supporting statement must be completed, and an agency should also complete relevant portions of Part B when conducting a census survey (collections that are sent to the entire universe or population under study). For example, an agency doing a census of a small, well-defined population may not need to describe sampling procedures requested in Part B, but it should address what pretesting has taken place, what its data collection procedures are, how it will maximize response rates, and how it will deal with missing unit and item data.

Agencies conducting qualitative research studies or program evaluations, including case studies or focus groups, should also complete the relevant sections of Part B to provide a more complete description of the use of the information and the methods for collecting the information.

B. Collections of Information Employing Statistical Methods

1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


This is a renewal request for an ongoing study.


The respondent universe for this study includes the individuals, partners, businesses, and other entities that have any connection to the Pacific Coast Groundfish and Whiting Fisheries, the fisheries that have been rationalized. Expected respondent types include fishermen, vessel owners, vessel operators, former groundfish limited entry permit owners, groundfish quota share permit owners/holders, quota allocation recipients, crew aboard groundfish/whiting vessels, mothership operations, catcher-processor operations, shoreside processors, other at-sea processors, first receivers/buyers, observers, and other stakeholders in the fishery such as partners or spouses. In addition, the survey/interview pool will include businesses directly tied to the groundfish/whiting communities through the supply of commercial goods, including, but not limited to, net suppliers, fuel suppliers, and equipment suppliers. As this fishery has progressed, a sector of the fixed gear fishery has purchased quota to target sablefish. These individuals are also included within the survey universe, which has resulted in increased target numbers for the later survey years.



The survey will be a census of the groundfish trawl/fixed gear fishery as described; that is, all individuals who meet the descriptions above. A census is used because these key participants in the fishery are not documented in any database, yet the type of management applied to this fishery (catch shares) directly impacts their communities, businesses, and families. A random sample is not possible because no data are available to fully identify a sample frame. The only known information is for vessel owners, previous limited entry permit holders, quota allocation recipients, and previous study participants, as this is the only information tracked. Even this information presents challenges: some ownership data are listed under LLCs or law firms, which does not identify an individual, such as an operator or crew member, for a specific business. Additional effort is required to pursue full representation of all vessels and businesses that participate in this fishery. As a result, all other counts of respondents are estimates. Calculations have been developed to estimate the number of respondents; values for these calculations come from a combination of published data, previous data collection efforts (2010, 2012, 2015/2016), and personal communications. The Northwest Regional Office provides several tables, including IFQ Vessel Accounts showing vessel names and vessel owner names (sometimes businesses), the Quota Share Permit Owners, and a list of IFQ First Receiver Site Licenses showing the processors who hold those site licenses. Information such as estimates of how many crew are on vessels came from personal communications during the pilot/study review process with NMFS employees and industry members, continued communication with observers, the NWFSC EDC data collection estimates, and participant observation and inquiries during previous data collection efforts. This combination of information is used to estimate the number of crew on participating vessels that will be part of the survey respondent group (see Table 1). This is believed to be the most accurate process for estimating the field of respondents.
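To illustrate the kind of arithmetic behind these estimates, the sketch below combines a known count from the permit lists with an assumed crew-per-vessel figure. The specific values (active vessel count and average crew per vessel) are hypothetical placeholders, not the study's actual inputs, which come from the sources described above.

```python
# Minimal sketch of the respondent-universe arithmetic described above.
# The values below are hypothetical; the study derives its inputs from
# IFQ Vessel Account lists, informant communications, and EDC estimates.

active_vessels = 100        # hypothetical number of vessels expected to field crew
avg_crew_per_vessel = 1.5   # hypothetical average crew per vessel

estimated_crew = round(active_vessels * avg_crew_per_vessel)
print(f"Estimated crew respondents: {estimated_crew}")  # -> 150, as shown in Table 1
```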


Information on another sector of the study population, processors, is not as clear as the vessel owner and permit holder information. Initially these estimates were based primarily on the literature; the new estimates are adjusted to account for experience and observations during the prior data collection efforts (see Table 1). Please note that the processing sector has been, and continues to be, very difficult to access and unwilling to participate in large numbers. Pacific Fishery Management Council documents describe shoreside processors for both trawl groundfish species and whiting species (PFMC and NMFS 2009). Those data identified a list of possible processors on the West Coast. For this research, the processors were confirmed, and the assumption remains that each processor has at least one owner, if not multiple owners. The owner(s) of each processor will be contacted, and a census of the processor owner population will be conducted.

Table 1. Response estimates by role, including companies, individuals, and potential meetings.

Description                                   | No. Companies | No. of Estimated Respondents | Estimated Responses (60%+ Response Rate)
----------------------------------------------|---------------|------------------------------|------------------------------------------
Vessel Owners / Quota Share Permit Holders    |               | 200                          | 128
Prior Permit Owners Only                      |               | 20*                          | 13
Crew Estimateº                                |               | 150                          | 45
Shoreside Processor Owners (CA, OR, & WA)     | 71            | 90                           | 27
Shoreside Processor Employees (CA, OR, & WA)  | 71            | 50                           | 15
Industry Supply Company Owners and Employees  | 15            | 30                           | 18
Misc. Fishermen/Processors – Interviews ONLY  |               | 60                           | 36
Observers                                     |               | 50                           | 30
Fishery Related Organizations – Meetings      | 10            | 10                           | 6
Misc. Others                                  |               | 60                           | 36
Total                                         |               | 720                          | 349


*Some vessels and permits are co-owned, but both owner names are not listed in the permit data, so additional respondents were added to account for vessels with more than one boat owner.

As previously indicated, these numbers have been updated based on the response types from the previous data collections and they include a few adjustments for new entries to the fishery.

ºCrew estimates have been adjusted downward based on knowledge from the 2010 and 2012 data collection efforts and preliminary observations from the 2015/2016 data collection effort. Because the program has consolidated the fishery, fewer crew are working; in addition, crew turnover is high, crew are hard to track down for participation, and crew decline to participate at a higher rate. Additionally, the requirement to carry observers on board reduces the number of crew on smaller vessels, further reducing the crew count. A new study is currently being designed specifically to target crew.

+An average response rate of 60% was calculated. For the vessel owner/quota share permit holder group (200 estimated respondents), a 64% response rate is estimated; for the crew and processing sector respondents, a 30% response rate is estimated. Personal communications and experience during prior data collection efforts suggest that access to processing sector personnel and crew is increasingly difficult, so a lower response rate is projected for this pool of respondents.
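As a quick illustration of the footnoted arithmetic (not part of the study's actual tooling), the sketch below applies the stated differential response rates to the targeted counts from Table 1.

```python
# Worked check of the expected-response arithmetic described in the footnotes:
# roughly 64% for vessel owners/quota share permit holders, 30% for crew and
# processing-sector respondents, and 60% for most other groups.

targets = {
    "Vessel owners / quota share permit holders": (200, 0.64),
    "Crew": (150, 0.30),
    "Shoreside processor owners": (90, 0.30),
    "Industry supply company owners and employees": (30, 0.60),
}

for group, (targeted, rate) in targets.items():
    expected = round(targeted * rate)
    print(f"{group}: {targeted} targeted x {rate:.0%} = {expected} expected respondents")
```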


No list of individuals exists for several sub-populations of the study: vessel crew, processor employees, suppliers, service providers, and spouses active in the fishery. Access to these individuals will be sought through various means. First, contact information from previous data collection efforts will be checked to confirm it is still current; crew tend to move and change contact information frequently. Where information has changed or new personnel have entered the fishery, we will use the methods from previous collections to contact crew and processor employees. We will contact vessel owners, quota share permit owners, and processor owners and ask for lists of employees and/or for permission to contact their employees. In past data collection efforts we had some success working with the NMFS Observer and Survey programs as key informants to reach crew aboard vessels; we will continue to strengthen this connection, as observers also change over time. There are various community organizations related to this fishery, for example the Newport Fishermen's Wives, Inc. We will continue to work closely with these organizations to reach members who are fishermen and processor employees. Working with all these individuals has increased participation in our research and knowledge of our research in local communities, and has strengthened working relationships with community members as they collaborate to support our research efforts. All individuals who complete the survey/interview process will be shown the compiled lists and asked whether there are other crew/staff not listed. These methods have increased access to applicable participants in the past and will be pursued in the future.


Prior efforts over three data collections in 2010, 2012, and 2015/2016 yielded response rates of 72%, 57%, and 59%, respectively (Table 2). Per the literature, these are comparable and acceptable response rates (Anseel et al. 2010; Baruch 1999; Carothers 2013; Himes-Cornell et al. 2015; Sivo et al. 2006); indeed, many fields of study that conduct surveys note that response rates often go unreported altogether (Baruch 1999; Sivo et al. 2006). Non-response categories were tracked and are identified in Table 3. We were able to further code the write-in text from the 'Other' category in Table 3 and identify some consistent non-response categories; those derived from the 'Other' write-in option are identified in Table 4.
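A minimal sketch of how coded non-response reasons can be rolled up into the percentages reported in Tables 3 and 4 is shown below; the example records are hypothetical, not study data.

```python
# Hypothetical example of tallying coded non-response reasons into the
# percentage breakdowns reported in Tables 3 and 4. 'Other' write-ins carry
# a sub-code so they can be re-analyzed separately, as in Table 4.
from collections import Counter

nonresponse_records = [
    "Left messages, no return response",
    "Unable to contact due to bad information",
    "Surveys not returned",
    "Other: too busy",
    "Other: not interested",
    "Other: not interested",
]

top_level = Counter(record.split(":")[0] for record in nonresponse_records)
total = sum(top_level.values())
for reason, count in top_level.most_common():
    print(f"{reason}: {count / total:.1%}")
```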


Table 2. Number of participants by response type (survey only, interview only, or both) and survey year, with response rates. "Total survey" equals the sum of "survey and interview" and "survey only"; "total interview" equals the sum of "survey and interview" and "interview only."

                              | 2010 | 2012 | 2016
------------------------------|------|------|------
Survey and interview          | 196  | 230  | 262
Survey only                   | 43   | 24   | 13
Interview only                | 33   | 33   | 21
Total survey                  | 239  | 254  | 275
Total interview               | 229  | 263  | 283
Total overall participation   | 272  | 287  | 296
Total participants targeted   | 379  | 500  | 501
Response rates                | 72%  | 57%  | 59%


Table 3. Non-response categories for all three survey years, by percentage.

Reason                                        | 2010  | 2012  | 2016
----------------------------------------------|-------|-------|------
Left messages, no return response             | 34.2% | 36.1% | 25.1%
Unable to contact due to bad information¹     | 3.4%  | 16.6% | 25.6%
Agreed to participate but unable to arrange   | 8.5%  | 13.7% | 11.4%
Not applicable to study                       | -     | 9.8%  | 3.8%
Surveys not returned                          | 31.6% | 7.3%  | 3.8%
Immediate decline – multiple reasons          | 3.4%  | 5.4%  | 12.8%
Immediate decline – no reason                 | 7.7%  | 2.4%  | 0.9%
Health condition prohibitive/deceased         | 0.9%  | 2.9%  | 5.7%
Other*                                        | 10.3% | 5.9%  | 10.9%







Table 4. Percentage of non-response descriptions that were initially categorized as 'Other' in Table 3 and further analyzed below.

Reason                                 | 2010  | 2012  | 2015/2016
---------------------------------------|-------|-------|----------
Exit fishery                           | -     | -     | 10.0%
Retire                                 | -     | -     | 10.0%
Too busy                               | 10.0% | -     | 13.3%
Participating in a different fishery   | -     | 15.8% | 16.7%
Not interested                         | 40.0% | 21.1% | 23.3%
Other misc.                            | 50.0% | 63.2% | 26.7%


2. Describe the procedures for the collection of information including:

This data collection is a continuation of prior efforts and potentially the fifth request for reauthorization of an ongoing 5-year rotation of data collection. The initial data collection commenced in 2010 and followed the introduction of a new catch shares management system; the West Coast Trawl Fishery Catch Shares Program was implemented in 2011. The design elements of the management system guided data collection. After all design elements were completed in 2016, survey administration moved to a 5-year rotation.

This data collection is a census of all individuals who have an active role in the Pacific Coast Groundfish Trawl Fishery. These roles have been clarified and defined in Supporting Statement A. All individuals who meet the study criteria are provided an opportunity to participate in this research. No sample selection or other statistical representation of the study population and its associated statistics is used. The only known population is that of vessel owners and quota owners; this information is obtained at the beginning of each data collection effort from NOAA's West Coast Regional Office, Sustainable Fisheries Division, which manages the catch shares program. These data are complicated, as businesses can be listed as owners of both vessels and quota, while other owners are listed as individuals. As a result, a clear representative sample is difficult to determine, yet we conduct an analysis across all years to try to understand representation; this analysis is reported in general methodology publications and on our study website. All other information is clearly identified with sample size information wherever the data are used, for example in publications, presentations, and Council reports. This study collects both qualitative and quantitative data, and sample size information is reported for both.
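One plausible form for the cross-year representation analysis mentioned above, for the only fully known sub-population (vessel and quota owners), is sketched below. This is an assumption about how such a comparison could be run, not the study's documented procedure, and all counts are hypothetical placeholders.

```python
# Hedged sketch: compare the distribution of responding vessel/quota owners
# by state against the known owner list, using a chi-square goodness-of-fit
# test. All counts below are hypothetical placeholders, not study data.
from scipy.stats import chisquare

owner_list_by_state = {"WA": 60, "OR": 90, "CA": 50}    # hypothetical known population
respondents_by_state = {"WA": 40, "OR": 55, "CA": 33}   # hypothetical respondents

total_owners = sum(owner_list_by_state.values())
total_resp = sum(respondents_by_state.values())

observed = [respondents_by_state[s] for s in owner_list_by_state]
expected = [total_resp * owner_list_by_state[s] / total_owners for s in owner_list_by_state]

stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"Chi-square = {stat:.2f}, p = {p:.3f}")
```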

Data collection occurs primarily through in-person survey administration and semi-structured interviews. Researchers live in and close to the communities they are researching and visit them frequently during the study period. Researchers discuss the research with study participants, administer the surveys, conduct any interviews, and are available to answer questions. They code the surveys for anonymity and confidentiality and collect all surveys upon completion. If individuals are unavailable to meet in person, various other options are available for participating in the study. Hard copy surveys can be provided either in person or by mail. Electronic versions of the survey are available for distribution via email or can be downloaded from the study website. Postage-paid envelopes are provided so that participants incur no mailing costs when returning surveys.

The ability to collect these data in person is critical to the success of the study and to reaching members of the industry who are not clearly identified in any recorded database. In-person communication, referrals, and dock contacts all contribute to increased participation and increased representation.

A 60% response rate is expected for most of the survey subgroups and is considered a comparable response rate for this population (Carothers 2013; Himes-Cornell et al. 2015). This expectation is based on the similar response rates obtained in each of the prior data collections, as well as in equivalent surveys conducted by the same principal investigator (Russell and Ruff 2014; Russell et al. 2014; Russell, Van Oostenburg, and Vizek 2018). Analysis of the results will include response rates for each question. This is an important aspect of the research, as the option to skip questions is provided as an additional layer of confidentiality. The strength and accuracy of each piece of data will therefore be represented through the response rate for that question, in addition to the overall response rates.
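Because respondents may skip questions, per-question response rates are reported alongside overall rates. A minimal sketch of that item-level calculation, with hypothetical survey records and column names, is shown below.

```python
# Hypothetical illustration of computing per-question (item) response rates
# when respondents are allowed to skip questions. Column names and values
# are placeholders, not the study's actual survey fields.
import pandas as pd

surveys = pd.DataFrame({
    "Q1_years_in_fishery": [12, 5, None, 20],
    "Q2_role": ["owner", None, "crew", "crew"],
    "Q3_job_satisfaction": [4, 3, 5, None],
})

# Share of returned surveys with a non-blank answer for each question.
item_response_rates = surveys.notna().mean()
print(item_response_rates.map("{:.0%}".format))
```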

Return respondents are identified and updated with each subsequent data collection. Individuals who have participated in a prior data collection effort are sorted into categories of participation. These categories may include those who have participated in all data collection efforts, or only the last two if they are new entrants. Statistical analysis may be used on this sub-population of respondents to represent change over time. These analyses are then compared to the entire data set, or to a sub-analysis by role or community, to identify differences. Typical statistics applied include descriptive statistics and Cochran's Q test to analyze differences between years for dichotomous response variables, an extension of the chi-squared (McNemar) test to three or more years (McNemar 1949). We use Friedman's test to analyze differences between years for ordinal response variables; it is a non-parametric repeated-measures test that allows comparison of three or more repeated measurements (Sheldon, Fillyaw, and Thomson 1996).
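A minimal sketch of these two tests, using hypothetical responses from return respondents across the three collection years, is shown below; it assumes SciPy and statsmodels are available and is not the study's actual analysis code.

```python
# Hedged sketch of the repeated-measures tests named above, on hypothetical data:
# Cochran's Q for dichotomous responses and Friedman's test for ordinal responses,
# with rows restricted to return respondents observed in all three years.
import numpy as np
from scipy.stats import friedmanchisquare
from statsmodels.stats.contingency_tables import cochrans_q

# Columns correspond to the 2010, 2012, and 2015/2016 collections.
dichotomous = np.array([
    [1, 1, 0],
    [1, 0, 0],
    [0, 1, 1],
    [1, 1, 1],
    [0, 0, 1],
])
print(cochrans_q(dichotomous))  # Cochran's Q statistic and p-value

ordinal_2010 = [4, 3, 5, 2, 4]
ordinal_2012 = [3, 3, 4, 2, 3]
ordinal_2016 = [3, 2, 4, 1, 3]
stat, p = friedmanchisquare(ordinal_2010, ordinal_2012, ordinal_2016)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")
```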

Non-response calculations are carefully tabulated and informed by detailed record keeping. We track the reasons participants do not respond in several categories and report on those categories and any others that trend highly. For example, in our 2015/2016 data collection effort, the health/death non-response category dramatically increased our non-response rate. These calculations are published along with other response rate calculations in our technical documents and on our web page.

Data collection has moved to a 5-year cycle following the initial data collection efforts in 2010, 2012, and 2015/2016. Funding was not secured to execute the first 5-year cycle in 2021, and the effort was further interrupted by COVID-19. Efforts to pursue the 5-year cycle will continue. This schedule will still allow tracking of the socio-cultural impacts of the catch shares program, while allowing time for these changes to be measurable and reducing the burden on study participants. No changes to the former survey or methodology are planned, as this is an ongoing collection and no collection occurred during the last authorization due to funding limitations and COVID-19 restrictions.

Catch share programs have long been documented to have substantial social impacts on fishing communities (McCay 1995; Olson 2011; Macinko 1997). Many studies of rationalized communities have been conducted only post-rationalization (Olson 2011; Dewees 1998; Carothers 2015). This study began pre-rationalization and continues post-rationalization. It also builds on methodologies used across multiple prior studies (Dewees 1998; Pollnac and Poggie 1988). All of these studies used a combination of the resources available to them to access stakeholders and the social science tools at their disposal (Coleman et al. 2019; Russell and Ruff 2014). Some fisheries are more heavily documented; for example, crew are licensed in Alaska, so scientists there may have access to contact information that does not exist elsewhere (Acheson 2001; Coleman et al. 2019). What has been consistent across the literature and the duration of studies of catch share programs is the study of multiple individuals in communities beyond just owners (Coleman et al. 2019; Olson 2011; Himes et al. 2014; Russell and Ruff 2014).

3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Various steps have been, and will continue to be, taken to maximize response rates.


As a reminder, no statistical sampling methodology is intended for this study population; no specific sampling frame is applied in this case.


The first step to increase response rates was working with industry members in a pilot study and providing them the opportunity to review and contribute to the development of the survey tool. The industry members selected are all key participants in various aspects of the industry, representing geographically diverse locations within the fishery, diverse roles within the industry, and diverse knowledge of the fishery. Each industry member has been invited to continue working with the study principal investigator to discuss the best approach for reaching study participants. Several of the industry members are committed to serving as key informants, gatekeepers, and primary contacts to many others in the industry. These individuals assist in communicating the research, have access to literature about the study to distribute to their constituents, and assist researchers in the field in coordinating with study participants. Working with industry members and including them in survey design and as points of contact has increased the response rate dramatically. Communication with these individuals has continued throughout the history of the study, and the relationships built help us gauge and understand our study population as we enter the field.


Additional efforts to increase response rates include in-person survey administration whenever possible. It has been the experience of other research efforts that conducting the research in person and collecting completed surveys immediately dramatically increases response rates (Rea and Parker 1997; Robson 2002; Russell and Schneidler-Ruff 2014). In addition, individuals participating in the research have the opportunity to communicate with the researcher and provide additional information of concern to them for inclusion in the data set.


Contact has also been made with other key members of NMFS, academia, and industry to better understand the study universe and to work together to collect a more complete data set. Communication with NMFS Northwest Regional Office staff (Frank Lockhart, Aja Szumylo, Abigail Harley), NMFS Observer Program personnel (John McVeigh, John Lafargue), NMFS survey program personnel (Aimee Keller, Victor Simon), NMFS Alaska Fisheries Science Center personnel (Steve Kasperski, Alan Haynie), NMFS SWFSC personnel (Rosemary Kosaka), other NMFS personnel (Todd Lee, Carl Lian, Karma Norman, Dan Holland), Pacific Fishery Management Council staff (Jim Seger, Kit Dahl, and Jennifer Gilden), Oregon Sea Grant personnel (Flaxen Conway, Jamie Doyle), California Sea Grant/UCSB academic staff (Carrie Pomeroy), and Oregon State University academic personnel (Flaxen Conway and Lori Cramer) is included in the collaborative efforts of this research. These efforts have increased the background knowledge available to the researchers, provided additional key informants and gatekeepers to the industry, and provided a support network throughout the West Coast for conducting this research. This network of information has contributed to an increased response rate. An example of how this will work is coordinating our approach to fishermen with observers. This coordination will serve two functions: 1) access to vessel schedules, and 2) gatekeeper assistance. The observers work with fishermen on a daily basis and know the boats' schedules, which allows researchers to be available to conduct the research at the times most appropriate for the survey respondents. It reduces the contact burden and extensive scheduling calls, and captures the targeted respondents when they are most available. In addition, the observers know the individuals of research interest personally. Collaborating with the observers and arranging for them to introduce researchers to study participants will likely increase the willingness of study participants to work with researchers.


Additionally, as this research effort has been underway for multiple years, we have established good relationships with community members by maintaining high levels of communication and working with them in their communities. We return to communities with results, participate in workshops requesting feedback on posters and preliminary results, and have increased local community knowledge of our research. As a result, during data collection efforts community members are willing to participate and to further support efforts to recruit other community members for inclusion in the study, increasing participation rates.


Multiple options will be provided for study participants to take part in the research. For individuals who are willing to work with us but do not want to fill out the survey, researchers will conduct an interview and complete the survey at the participant's discretion. For those who do not want to complete the entire survey, a section completion guide directs participants to the sections most important for the role the individual plays in the industry, limiting the sections the participant needs to complete. It is also clearly communicated that individuals can stop their participation at any time, stop completing the survey at any time, or skip any question of concern, without any personal consequence. For those individuals who are not interested in the survey at all but are willing to participate in an interview, researchers will limit their data collection to interviews. If a participant is willing to give us only a few minutes of their time, we will ask the questions outlined in Sections A and B of the survey instrument; these sections are estimated to take approximately 5 minutes to complete, and the responses will be used to analyze non-response bias. We also have printed surveys available to drop off and either pick back up from participants or have returned via a pre-paid envelope. If participants prefer, they can complete the survey electronically via the web or an emailed PDF version. The survey has also been translated and is available in multiple languages upon request. To control costs, and based on prior experience of low response rates, translations are not widely distributed, but they are made available to anyone who wishes.


This data collection has been, and continues to be, used widely by the agency because it provides unique data that are not otherwise available. The data have been used to inform mandatory fisheries reviews, such as the 5-year review of the catch shares program, and contributed to an amendment to the yelloweye rockfish ACL adopted in June 2018 by the Pacific Fishery Management Council (PFMC and NMFS 2017). This research has contributed to a broad foundation of knowledge on the 'greying of the fleet', contributing to both national and local management actions; identified women's roles in fisheries and how those roles are changing as fisheries change; is working to better inform our knowledge about crew and the changing dynamics surrounding crew; and is currently informing ongoing research into infrastructure and consolidation on the West Coast (Cramer et al. 2018; Russell et al. 2014; PFMC and NMFS 2017; Steiner et al. 2018; Vizek et al. 2020). Data from this study provide continuous inputs into various studies well beyond its original target, as the data represent fishermen who participate in diverse fisheries and can be sorted to understand their diverse roles in those fisheries. Data can be sorted by state and community to understand trends in management and fisheries in those specific places, how they differ, and what different resources they may require to succeed. Data from this research are also compared to other data from the East Coast to identify trends and challenges that may require larger efforts to understand and resolve. This is a unique data collection that consistently contributes to science well beyond its initial goals.


All data presented include response rates and context, and the presentation is transparent. Methodologies are included, with references to complete methodological explanations in documents such as the 5-year review and the technical memorandum. In those documents no character limits apply, so full explanations of sampling frames, response rates, return respondents, and non-response are discussed and available.

4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


This is a renewal of a well-established survey and questionnaire. Upon initial development, testing was extensive and thorough. As previously described, a full review of the study description, the study methodology, and the survey instrument has been undertaken. NMFS personnel (Todd Lee, Carl Lian, Mark Plummer, Dan Holland, Karma Norman, Steve Freese, Frank Lockhart, Aja Szumylo, and Abigail Harley), Pacific Fishery Management Council personnel (Jim Seger, Kit Dahl, and Jennifer Gilden), and other federal personnel in various regions have reviewed the survey tool and provided comments on both the survey tool and the study. As previously discussed in Question 3, key industry members were provided a description of the research, discussed the research with the principal investigator, and reviewed the survey tool in a pilot study. Communication with reviewers is being maintained to 1) communicate changes to the survey tool resulting from the reviews, and 2) lay the framework for the deployment of researchers into the field to conduct the research.


Information received from industry members and other NMFS personnel has been invaluable to the development and maintenance of the survey tool, and updates were made to improve the tool as a result. Their continued participation in this research is expected to contribute greatly to its success. Improvements such as appropriate language and terminology and specific clarifications have continued to be incorporated into the survey. These additions have improved the quality of the survey and, by streamlining it, reduced the burden of completing it. No changes have been made to the current version of the survey; it is consistent with the prior authorized version.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


As this is a renewal, some of the individuals who played a role in the long-term review of this study have since retired. The internal NMFS design, development, and review team, including statistical analysis, comprised Mark Plummer (original survey review; deceased); Dan Holland, (206) 302-1752; Dr. Karma Norman, social scientist, NWFSC, (206) 302-2418; Todd Lee, economist, NWFSC (original survey review; retired); Carl Lian, economist, NWFSC (original survey review; deceased); Steve Freese (original and second survey review; retired); Frank Lockhart, (206) 526-6142; Aja Szumylo, (206) 526-4746; Jim Seger, PFMC, (503) 820-2416; and Jennifer Gilden, PFMC (original survey review; departed).


The primary individuals expected to collect the data include Suzanne Russell, social scientist and principal investigator, NWFSC, and others to be identified. The team has typically included three researchers in California, one to two in Oregon, and three to four based out of Washington who travel to all locations as needed to collect data. Individuals expected to analyze the data include Suzanne Russell, (206) 860-3274, and possibly others to be identified.



Resources


Acheson, James M. 2001. Confounding the Goals of Management: Response of the Maine Lobster Industry to a Trap Limit. North American Journal of Fisheries Management 21:414-416.

Anseel, Frederik; Lievens, Filip; Schollaert, Eveline; and Choragwicka, Beata. 2010. Response Rates in Organizational Science 1995-2008: A Meta-Analytic Review and Guidelines for Survey Researchers. Journal of Business and Psychology 25(3):335-349.

Baruch, Yehuda. 1999. Response Rate in Academic Studies – A Comparative Analysis. Human Relations 52(4).

Carothers, Courtney. 2013. A Survey of US Halibut IFQ Holders: Market Participation, Attitudes, and Impacts. Marine Policy 38:515-522. doi:10.1016/j.marpol.2012.08.007.

Carothers, Courtney. 2015. Fisheries Privatization, Social Transitions, and Well-Being in Kodiak, Alaska. Marine Policy 61:9. doi:10.1016/j.marpol.2014.11.019.

Coleman, Jesse; Carothers, Courtney; Donkersloot, Rachel; Cullenberg, Paula; and Bateman, Alexandra. 2019. Maritime Studies 18:47-63.

Cramer, L.A.; Flathers, D.; Caracciolo, D.; Russell, S.; and Conway, F. 2018. Graying of the Fleet: Perceived Impacts on Coastal Resilience and Local Policy. Marine Policy 96:27-35.

Dewees, Christopher M. 1998. Effects of Individual Quota Systems on New Zealand and British Columbia Fisheries. Ecological Applications 8(1) Supplement:133-138.

Himes-Cornell, A.; Kasperski, S.; Kent, K.; Maguire, C.; Downs, M.; Weidlich, S.; and Russell, S. 2015. Social Baseline of the Gulf of Alaska Groundfish Trawl Fishery: Results of the 2014 Social Survey. U.S. Dep. Commer., NOAA Tech. Memo. NMFS-AFSC-306, 98 p. plus appendices.

Macinko, Seth. 1997. Property Rights and Social Transformation in the Halibut and Sablefish Fisheries off Alaska. In Social Implications of Quota Systems, eds. Gisli Palsson and Gudrun Petursdottir. Nordic Council of Ministers, Copenhagen.

McCay, Bonnie. 1995. Social and Ecological Implications of ITQs: An Overview. Ocean & Coastal Management 28(1-3):3-22.

McNemar, Q. 1949. Psychological Statistics. New York: John Wiley and Sons.

Olson, J. 2011. Understanding and Contextualizing Social Impacts from the Privatization of Fisheries: An Overview. Ocean & Coastal Management 54(5):353-363. doi:10.1016/j.ocecoaman.2011.02.002.

PFMC (Pacific Fishery Management Council) and NMFS (National Marine Fisheries Service). 2009. Rationalization of the Pacific Coast Groundfish Limited Entry Trawl Fishery; Draft Environmental Impact Statement Including Regulatory Review and Initial Regulatory Flexibility Analysis. Pacific Fishery Management Council, Portland, OR. November 2009. Accessed January 6, 2010. http://www.pcouncil.org/wp-content/uploads/0911_TRatEIS_Cover.pdf

PFMC (Pacific Fishery Management Council) and NMFS (National Marine Fisheries Service). 2017. West Coast Groundfish Trawl Catch Share Program: Five-Year Review. Approved by the Pacific Fishery Management Council November 16, 2017, Costa Mesa, CA.

Pollnac, Richard and John J. Poggie. 1988. The Structure of Job Satisfaction among New England Fishermen and Its Application to Fisheries Management Policy. American Anthropologist 90(4):888-901.

Rea, Louis M., and Richard A. Parker. 1997. Designing and Conducting Survey Research: A Comprehensive Guide. San Francisco, CA: Jossey-Bass.

Robson, Colin. 2002. Real World Research. Malden, MA: Blackwell Publishing.

Russell, S. and M.S. Ruff. 2014. The U.S. Whale Watching Industry of Greater Puget Sound: A Description and Baseline Analysis. U.S. Dept. Commer., NOAA Tech. Memo. NMFS-NWFSC-126.

Russell, Suzanne M., Kimberly Sparks, Albert Arias-Arthur, and Anna Varney. 2014. The Pacific Groundfish Fishery Social Study: An Initial Theme-Based Report. Seattle, WA: National Marine Fisheries Service.

Russell, Suzanne, Max Van Oostenburg, and Ashley Vizek. 2018. Adapting to Catch Shares: Perspectives of West Coast Trawl Participants. Coastal Management 45(6):603-620.

Sheldon, M.R., M.J. Fillyaw, and W.D. Thomson. 1996. The Use and Interpretation of the Friedman Test in the Analysis of Ordinal-Scale Data in Repeated Measures Designs. Physiotherapy Research International 1(4):221-228.

Sivo, Stephen A.; Saunders, Carol; Chang, Qing; and Jiang, James J. 2006. How Low Should You Go? Low Response Rates and the Validity of Inference in IS Questionnaire Research. Journal of the Association for Information Systems 7(1).

Steiner, Erin; Russell, S.; Vizek, A.; Pfeiffer, L.; and Warlick, A. 2018. Crew in the West Coast Groundfish Catch Share Program: Changes in Compensation and Job Satisfaction. Coastal Management 46(6):656-676.

Vizek, A.; Van Oostenburg, M.; and Russell, S. 2020. The Transition to Catch Shares Management in the West Coast Groundfish Trawl Fishery: Changing Job Attitudes and Adjusting Fishing Participation Plans. Society and Natural Resources 33(10):1175-1193.

¹ As these individuals could not be reached, it is unclear whether they can be considered to have exited the fishery.
