
SUPPORTING STATEMENT

Cook Inlet Beluga Whale Economic Survey

OMB CONTROL NO. 0648-xxxx



A. JUSTIFICATION


This is a request for a new information collection.


1. Explain the circumstances that make the collection of information necessary.


The population of Cook Inlet beluga whales (Delphinapterus leucas), found in the Cook Inlet of Alaska, is one of five distinct population segments (DPSs) in United States (U.S.) waters. It was listed as endangered under the Endangered Species Act (ESA) on October 22, 2008 (73 FR 62919). The population was previously designated as a depleted species under the Marine Mammal Protection Act (MMPA) of 1972 (16 U.S.C. 1362) on May 31, 2000 (65 FR 34590).


The National Marine Fisheries Service (NMFS) is the primary agency responsible for the protection of marine mammals in the U.S., including the Cook Inlet beluga whale (CIBW). Under the terms of the ESA, NMFS initiated the process of developing a recovery plan for the CIBW in March 2010 and published the final rule designating critical habitat for CIBW on April 11, 2011 (76 FR 20180). To aid in the process of plan development, NMFS appointed a Recovery Team composed of two voluntary advisory groups: a Scientific Panel and a Stakeholder Panel. Additionally, under terms of Section 6 of the ESA and the limited cooperative agreement with the State of Alaska signed on December 3, 2009, NMFS coordinates management activities for protected species with the Alaska Department of Fish and Game (ADF&G).


While a number of actions to halt the decline of the CIBW have been implemented since 2000 by NMFS, in cooperation with ADF&G, Alaska Native tribal governments, and other state and local agencies, recovery planning and management is expected to be ongoing for the foreseeable future. Over the course of this process, multiple management actions may be considered by NMFS, the Recovery Team, and cooperating agencies in their efforts to protect and aid in the recovery of the CIBW population. In deciding between management actions, planners and policy makers must balance the ESA and MMPA goals of protecting CIBWs from further declines with economic activities and development in the Cook Inlet region. Cook Inlet beluga whale protection actions may be subject to Executive Order 12866 (58 FR 51735), which requires regulatory agencies to consider costs and benefits in deciding among alternative management actions.


The public benefits associated with actions to protect the CIBW population that may help the species recover are primarily the result of the non-consumptive value people attribute to such protection (e.g., active use values associated with being able to view beluga whales and passive use values unrelated to direct human use). Little is known about these values, yet such information is needed for decision makers to more fully understand the trade-offs involved in choosing among potential protection alternatives and to complement other information available about the costs, benefits, and impacts of protection alternatives. A survey is needed to collect the information necessary to estimate public values for protection of CIBWs and the impacts of that protection.


This data collection and the subsequent research and analysis will provide the information needed to allow for a fuller range of benefits to be considered along with cost estimates, as well as other non-economic information, in the analysis of management actions that affect the CIBW. Previously, a pilot version of the survey was administered (OMB Control No. 0648-0621) to a small sample of households to evaluate the survey instrument and administration protocols. In particular, the pretest gathered a sufficient number of responses to evaluate the information presentation, reliability, internal consistency, response variability, and other properties of a newly developed survey, but too few to estimate economic values of interest in the full data collection.


2. Explain how, by whom, how frequently, and for what purpose the information will be used. If the information collected will be disseminated to the public or used to support information that will be disseminated to the public, then explain how the collection complies with all applicable Information Quality Guidelines.


The information will be collected through a mail survey of a sample of Alaska households. Sampled households will be contacted by NMFS through a mixture of mailings and telephone calls designed to maximize survey response. The mailings include an advance letter describing the study and requesting the individual’s participation, followed by the survey questionnaire, a reminder postcard, and a second full mailing. In addition to the mail contacts, a telephone contact will be attempted with non-responding households for whom we have telephone numbers. The primary purpose of the telephone contact is to encourage response to the mail survey. Households that have not responded after the first five contacts (advance letter, initial mailing, postcard reminder, telephone reminder, and second full mailing) will be sent, by certified mail, a short non-response survey that collects information necessary to evaluate non-response behavior.


Mail Questionnaire


Survey responses gathered from the questionnaire include information about the following:


  1. Public preferences regarding the protection of CIBWs.

  2. Factors such as the risk of extinction to the DPS, listing status, and protection costs that affect the public’s preferences for protecting CIBWs.

  3. Information on general familiarity, attitudes, and preferences regarding protection of threatened and endangered species, and other priorities for government action.


The data will be used by NMFS to estimate a preference function for explaining choices between protection programs that differ in extinction risk levels, ESA listing status, and costs. This estimated function will provide NMFS with information on public preferences and values for alternative CIBW protection programs and on what factors affect these values. This information can then be compared with program costs and other impacts when evaluating protection alternatives.
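For illustration only, the sketch below shows one common way such a preference function could be estimated: a conditional (McFadden) logit fit by maximum likelihood. The attribute set, coefficient values, and data are simulated assumptions for the sketch and do not represent the study's model or results.

```python
# Minimal sketch of estimating a conditional logit preference function from
# choice-experiment data. All attributes, coefficients, and data are simulated
# for illustration; this is not the study's estimation code.
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

rng = np.random.default_rng(0)
n_resp, n_alts = 500, 3

# One choice occasion per respondent; columns: extinction risk (%), ESA-listed
# indicator, and added annual household cost ($).
X = rng.uniform(size=(n_resp, n_alts, 3)) * np.array([60.0, 1.0, 200.0])
X[:, :, 1] = (X[:, :, 1] > 0.5).astype(float)

true_beta = np.array([-0.05, -0.8, -0.02])
utility = X @ true_beta + rng.gumbel(size=(n_resp, n_alts))
choice = utility.argmax(axis=1)          # index of the chosen alternative

def neg_loglik(beta):
    v = X @ beta                          # systematic utility of each alternative
    logp = v - logsumexp(v, axis=1, keepdims=True)
    return -logp[np.arange(n_resp), choice].sum()

fit = minimize(neg_loglik, np.zeros(3), method="BFGS")
b_risk, b_listed, b_cost = fit.x
print("estimated coefficients:", fit.x)
# Marginal willingness to pay for a one-point reduction in extinction risk:
print("implied WTP per 1-point risk reduction:", b_risk / b_cost)
```

In this kind of specification, the ratio of an attribute coefficient to the cost coefficient gives the marginal willingness to pay for that attribute, which is the comparison with program costs described above.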


The following is a discussion of how particular questions in the mail questionnaire will be ultimately used. Generally, the survey asks respondents for information regarding their knowledge and opinions of CIBWs, other endangered species, other whale species, and potential goals and impacts of management options available to protect the endangered population of CIBWs, in addition to standard socio-demographic information needed to classify respondents. It is divided into several sections.


Section 1: The Issue: Endangered Cook Inlet Beluga Whales


Prior to the first section, respondents are asked a general social issues question. To put the issue of protecting threatened and endangered species in the context of the broad variety of priorities for government action (each with costs), and thus to reduce survey “importance bias”, Q1 asks the respondent whether less, about the same, or more should be done with respect to several other issues facing the U.S. In addition to protection of threatened and endangered species, the set of issues listed includes government efficiency, education, road and highway improvements, economic growth and jobs, and air and water pollution.


The first section identifies the CIBW as protected under the Endangered Species Act (ESA) and presents information about the ESA, including definitions of “endangered” and “threatened” species, which are important to the policy questions in the survey. Since the CIBW is protected under the ESA as a distinct population segment (DPS), not as a distinct species, respondents are informed that the ESA may also protect a DPS. The introductory material also presents a breakdown of how many species are protected under the ESA to help place CIBWs in context as one of many ESA-protected species. Finally, the introduction notes that the ESA requires that reasonable actions be taken, which begins to motivate the questions about alternative actions to consider. The section also lists reasons people may care about threatened and endangered species and the types of costs that result from protecting them.


  • Q2 asks how positive or negative the respondent’s reaction is when they think about the Endangered Species Act. This simple question identifies people’s general feelings toward endangered species protection. The question provides an easy start to the process of thinking about preferences regarding threatened and endangered species, and, in combination with Question 1, sets a tone of neutrality by accommodating positive and negative reactions at the start of the survey. In initial testing (and a past study), responses to this question were good predictors of how respondents would answer the stated preference questions.


  • Q3 asks respondents whether they are aware that the ESA protects distinct population segments in addition to entire species. This question is used as a way to encourage respondents to read and understand the information regarding the ESA and its protection of DPSs in addition to entire species.


  • After providing some general reasons for and against protecting threatened and endangered species (again providing a neutral perspective), Q4 addresses the importance to the respondent of general protection of threatened and endangered species, and whether protecting jobs is more or less important than threatened and endangered species protection to the respondent. Responses to this question were also found to be correlated with response patterns to stated choice questions in initial qualitative testing (i.e., focus group).


To properly elicit preferences regarding added protection of CIBWs, it is necessary to accurately define the good to be valued and to provide the context within which it is produced, so that respondents fully understand what they are being asked to value. Part of the process of providing context for the valuation involves discussing the species that may serve as substitutes in individuals’ minds for CIBWs. In focus groups, a natural set of substitutes that people identified for CIBWs was other whale species. This section provides a graphic of endangered whales residing in U.S. waters, with some information about whether the entire species or only one or more DPSs are protected. This graphic is useful for illustrating that the CIBW is one of several whale species in the U.S. that are protected by the ESA.


  • Q5 is used to determine whether respondents have had prior experience observing whales, and aids in encouraging respondents to review the information provided.


Section 2: Some Beluga Whale Facts


This brief section introduces several facts about beluga whales generally.


  • Like Q5, Q6 is intended to get respondents to begin thinking about beluga whales and assess their familiarity with beluga whales prior to reading the survey.


Section 3: Beluga Whales in the U.S.


This brief section provides a map of Alaska and a table describing where the five beluga whale DPSs are located, what their population sizes are, and what the population trend is for each.


  • Q7 is another question intended to put the issue of CIBWs in a larger context (all beluga whales) and asks respondents whether they are concerned about the DPSs that are declining given that other DPSs are stable or increasing.


Section 4: Cook Inlet Beluga Whales


This section describes how the CIBW DPS is different from the other DPSs, where it is located, its ESA listing, natural and human-related factors associated with the past population decline, its current population trend, past and present efforts to protect it, and economic activities in the Cook Inlet that may be affected by protection measures, as well as the current estimated risk of extinction for the DPS under current conditions. This and the next section define the baseline of current and expected future conditions with current management programs, which is required for proper valuation of alternative levels of protection.


  • Q8 asks whether the respondent has ever seen, heard, or read about the CIBW before reading the survey and is intended to get the individual thinking about the species and what they know about it.


  • Respondents are asked how concerned they are about the CIBW in Q9. This information serves dual purposes. First, this question encourages the respondent to read and understand what is occurring with the DPS, and second, provides information that can be used to check for consistency of preferences with responses to stated preference questions.


  • Q10 asks specifically about the risk of extinction information discussed above the question. It is intended to encourage the respondent to read the information on extinction risk carefully and consider whether the estimate is concerning from the respondent’s perspective.


Section 5: New Cook Inlet Beluga Whale Protection Actions


This section introduces the concept of additional protection actions for CIBWs being undertaken and sets the stage for asking about protection alternatives and specific outcomes in the stated preference questions. In this section, different types of protection actions that would help CIBWs to recover are described in general terms, the term “recover” is defined, and the costs of additional protection actions (payment vehicle) are discussed in terms of the effects they would have on individual households.


  • Q11 asks respondents to what extent they agree with two statements, one indicating a desire to help the CIBW recover, even if it costs more money; and the other stating that the most effective protection actions should be used even if businesses and individuals are negatively affected. The question serves the purpose of acknowledging that there are costs to protecting CIBWs and informing the respondent about these costs. This is important for maintaining a neutral stance regarding protection and minimizing information bias. Additionally, agreeing with the first statement indicates a willingness to spend money to protect the DPS, while disagreement suggests individuals may not choose costly programs to help the DPS. Disagreement with the second statement provides a reason why individuals may not be willing to spend additional money to protect CIBWs.


Section 6: What Alternatives Do You Prefer?


This section contains the stated preference questions, which are in a choice experiment, or stated choice, framework. The section begins with instructions for answering the questions and a budget reminder. In addition, a “cheap talk” script (e.g., Cummings and Taylor [1999]) is included to minimize potential hypothetical bias. Cheap talk refers to introductory text provided to the respondent before the stated preference questions are asked that explains what hypothetical bias (i.e., the potential bias associated with the respondent not being compelled to pay the amount they state they would pay) is, why it is problematic, and an appeal to not introduce this potential bias by answering the questions as truthfully as possible. The instructions and cheap talk script are followed by four stated choice questions (Q12, Q13, Q14, and Q15) and follow-up questions (Q16, Q17). The information from these questions will be used to estimate a CIBW protection preference function.


  • In each of the four choice questions (Q12 through Q15), respondents are presented with three alternatives that differ in what they do and how much they cost: the current CIBW protection program (Alternative A), which is the status quo alternative, and two others that do more and cost more (Alternatives B and C), which are presented in a way intended to encourage respondents to view the non-status quo alternatives as distinct across choice questions. These alternatives are described by their expected results with respect to the following attributes:


  1. Population status in 50 years

  2. Risk of extinction by the year 2110

  3. Added household cost (see footnote 1)


Respondents are then asked to choose the alternative they most prefer, and which they least prefer. The status quo is always the first option to make it easy for respondents to select it (and reduce any unintended bias in selecting alternatives to do more and spend more), and to allow rank ordering of non-status quo alternatives relative to the baseline (Alternative A), which provides statistical efficiency gains over paired choices.
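As a sketch of how the most/least-preferred answers could be used, the snippet below converts a most/least response over the three alternatives into a full ranking and into the "exploded" pseudo-choices used by a rank-ordered logit, which is one common way to obtain the efficiency gains mentioned above. The function name and data layout are illustrative assumptions, not the study's processing code.

```python
# Illustrative sketch: expand a "most preferred" / "least preferred" answer over
# three alternatives into a full ranking and the exploded pseudo-choice sets
# used by a rank-ordered logit. Hypothetical layout; not the study's code.
ALTS = ("A", "B", "C")  # A = status quo; B and C do more and cost more

def explode_ranking(most: str, least: str):
    middle = next(a for a in ALTS if a not in (most, least))
    ranking = [most, middle, least]
    # First pseudo-choice: best alternative out of all three.
    # Second pseudo-choice: best of the remaining two.
    pseudo_choices = [
        (ranking[0], set(ALTS)),
        (ranking[1], {ranking[1], ranking[2]}),
    ]
    return ranking, pseudo_choices

ranking, pseudo = explode_ranking(most="B", least="A")
print(ranking)  # ['B', 'C', 'A']
print(pseudo)   # [('B', {'A', 'B', 'C'}), ('C', {'C', 'A'})]
```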


  • In Q16, respondents are asked to agree or disagree with several statements that are used to help address several concerns about people’s responses, including: whether respondents feel it is their responsibility to pay for CIBW protection at all (potential protest); whether respondents had enough information to make an informed choice (the effect of uncertainty on values); whether respondents were paying just for CIBWs or believed other species were being protected by the alternatives considered (potential embedding); whether respondents believed the federal government could effectively manage the CIBW protection programs to bring about the results being valued (potential protest); whether respondents feel they should not have to pay more federal taxes for any reason (potential protest); whether the scientific estimates of future extinction risk were believable to the respondent (potential protest); whether the respondent felt qualified to choose between different extinction risks (potential protest); and whether the respondent is unwilling to pay if there is any risk of extinction.


  • Q17 identifies how confident individuals are about their answers to the stated preference questions. Respondents stating they are “not at all confident” in their answers may be excluded from the estimation since these individuals, for whatever reason, are uncertain that their answers reflect how they feel.


  • The final question (Q18) in the section is intended to gauge respondents’ general environmental attitudes using questions from the New Ecological Paradigm (NEP), a series of Likert-scale questions that measure pro-environmental sentiments on several dimensions (Dunlap, van Liere, Mertig, and Jones, 2001). These questions have been used in numerous environmental surveys. An understanding of general environmental attitudes may help explain responses to stated preference questions and enable classification of respondents.


Section 7: About You and Your Household


This final section consists of eleven questions, Q19 through Q29, that collect information about the respondent and the respondent’s household to be used as explanatory variables in the stated preference model, for comparing the sample to the population (coverage or sampling bias), and for comparing respondents to non-respondents (non-response bias). To the extent possible, the questions and response categories parallel those used by the Census Bureau to allow the most direct comparisons.


  • Socioeconomic, demographic, and classification information collected includes gender (Q19), age (Q20), household size (Q21), employment status (Q22), membership in an environmental or conservation program (Q23), recent fishing and hunting behavior (Q24), educational attainment (Q25), household ownership status (Q26), ethnicity (Q27), race (Q28), and income (Q29).


Telephone Follow-Up


Following the initial mailing and postcard reminder, we will contact non-respondents by telephone to encourage them to complete the mail survey.2 No additional information will be collected from these individuals, as this telephone call will be used solely to encourage individuals to respond to the mail survey.


Non-Respondent Survey


After the telephone contact and second full mailing, individuals who have still not responded will receive a non-respondent survey. This short, 2-page survey will be sent by certified mail. The non-respondent survey includes selected socioeconomic and demographic questions, along with two key attitudinal questions and a question that directly asks for the reasons the individual may not have participated in the main mail questionnaire. Information about these variables will enable statistical tests of whether non-respondents differ from respondents with respect to these characteristics. The attitudinal questions include versions of Q2 and Q4 from the mail questionnaire. Responses to questions like these have been shown to be correlated with responses to stated preference questions in earlier rounds of focus groups and cognitive interviews and in the formal pretest. This information can be used to evaluate and adjust the results for potential non-response bias among sample members.
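As an example of the kind of statistical test these data could support, the sketch below compares respondents and non-respondents on a single characteristic with a chi-square test; the characteristic and counts are invented for illustration and are not study data.

```python
# Sketch of a respondent vs. non-respondent comparison on one characteristic
# (hypothetical counts, not study data).
from scipy.stats import chi2_contingency

#                  has trait   lacks trait   (e.g., college-educated: yes / no)
respondents      = [620,        1165]
non_respondents  = [70,         180]

chi2, p, dof, expected = chi2_contingency([respondents, non_respondents])
print(f"chi-square = {chi2:.2f}, p-value = {p:.3f}")
# A small p-value would indicate the groups differ on this characteristic,
# flagging potential non-response bias that the adjustment described above
# would need to address.
```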


The National Oceanic and Atmospheric Administration (NOAA) will retain control over the information and safeguard it from improper access, modification, and destruction, consistent with NOAA standards for confidentiality, privacy, and electronic information. See response to Question 10 of this Supporting Statement for more information on confidentiality and privacy. The information collection is designed to yield data that meet all applicable information quality guidelines. Although the information collected is not expected to be disseminated directly to the public, results may be used in scientific, management, technical or general informational publications. Should NOAA decide to disseminate the information, it will be subject to the quality control measures and pre-dissemination review pursuant to Section 515 of Public Law 106-554.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology.


The survey will be administered as a mail survey and therefore does not involve the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology.


4. Describe efforts to identify duplication.


The economics literature was consulted extensively to identify studies that valued Cook Inlet beluga whales. To date, there has not been any study that provides economic value information for CIBWs.3 However, a recent unpublished government study by Olar, et al. (2007) valued the protection of beluga whales in the St. Lawrence Estuary in Canada, which is classified as threatened under the Species at Risk Act (SARA) in Canada. The study uses stated preference choice experiment data collected from a survey of Canadian households using an Internet-enabled Web panel that achieved a cooperation rate of 52%. Mean household willingness to pay for improving the St. Lawrence Estuary beluga whale from its currently threatened status to a special concern status was estimated to be $107 (Canadian dollars), with a standard deviation of about $12. For a larger improvement, from threatened to not at risk, the mean household willingness to pay (WTP) was estimated to be $122 (Canadian dollars) with a standard deviation of about $17. While these results suggest a positive WTP for improving the status of beluga whales in the St. Lawrence Estuary, the WTP is for Canadian households and does not speak to Alaska households’ preferences and values.


Although there are no existing survey efforts to understand the public’s preferences and values for protecting CIBWs, there are numerous examples of studies conducted to estimate the non-consumptive value of other endangered species and marine mammals. Examples include Bosetti and Pearce (2003), Langford, et al. (2001), Jakobsson and Dragun (2001), Fredman (1995), Hagen, et al. (1992), among others. All these studies utilized contingent valuation methods, as do the vast majority of species valuation studies.4 As a result, they are unable to fully analyze marginal values of attributes of the species protection. The proposed study departs from most of the existing literature in its use of a stated choice framework that allows marginal values of attributes of protection programs to be estimated. The added information provided by this approach arms decision makers with better information about how much the public would benefit from programs that lead to differing results, and thus represents a flexible tool for management. A recent study by Lew, Layton, and Rowe (2010) illustrates an application of this approach with respect to the valuation of protection for a U.S. threatened and endangered species (the Steller sea lion).


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


The collection does not involve small businesses or other small entities.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently.


The survey is necessary to gather data for estimating public values for additional protection for CIBWs, beyond what is currently being done. If the data collection is not conducted, NMFS will have to rely on information about public values for other species to infer the value of protecting CIBWs using benefits transfer methods to consider along with other important information in decisions about CIBW management alternatives.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner inconsistent with OMB guidelines.


The collection is consistent with OMB guidelines.


8. Provide information on the PRA Federal Register Notice that solicited public comments on the information collection prior to this submission. Summarize the public comments received in response to that notice and describe the actions taken by the agency in response to those comments. Describe the efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


A Federal Register Notice published on February 7, 2012 (77 FR 6065), solicited comments on the information collection. The information collection described therein was for a survey with a larger scope; the target population has since been refined to Alaska households.


There were three requests for copies of the survey and five letters submitted providing substantive comments. One additional comment was a general statement about there not being a need to spend government money on surveys of this type. No response was prepared for this comment. Copies of the survey instrument were provided to the requesting individuals.


All five letters with substantive comments focused on the first of four topics on which the FR Notice invited comments: “whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility.” Although the comments did not address the other three topics, they did provide multiple substantive points of criticism. On review of the submitted letters, NMFS staff identified eight unique comments and drafted detailed responses to each. Comments and responses are presented in an Appendix, submitted as a supplementary document.


All of these commenters recommend cancellation of the proposed data collection, and two suggest that proceeding with the information collection prior to completion of the CIBW Recovery Plan is inappropriate. For reasons described in the responses to comments, NMFS disagrees with the claims that the information collection is unnecessary for the proper performance of the agency’s function, and that it is appropriate to delay the research pending completion of Recovery Plan development. NMFS intends to proceed with the data collection, pending clearance under PRA.


Four of the comment letters were submitted by members of the CIBW Recovery Team Stakeholder Panel. All expressed concerns about the lack of coordination with the Recovery Team and other government entities on development of the proposed data collection, and interpreted the Federal Register (FR) notice as a statement of NMFS’ intent to pursue CIBW protection actions outside of the framework of the established recovery plan process and cooperative agreement with the State of Alaska Department of Fish and Game. In response, senior Protected Resources staff at NMFS Alaska Region communicated in writing to Recovery Team members to clarify the objectives and intended use of survey results in the recovery plan process and subsequent economic analysis of options developed by the Recovery Plan team. The letter also states NMFS’ intent to better inform team members in the future about progress and findings related to this project. In addition, a response was sent by the principal investigators to the commenter referencing the letter to the Recovery Team and further emphasizing the survey objectives and intent to improve consultation with the Recovery Team members. The FR notice text for publication upon submission of this PRA clearance request to OMB was revised to clarify the objectives of the research.


The survey instrument presents the latest information on CIBWs, current population trends, alternative management options, and likely impacts of management options. To ensure that the information is as accurate as possible, numerous CIBW researchers and biologists have reviewed the survey instrument, including Mr. Jon Kurland (Director of NMFS Alaska Regional Office’s Habitat Conservation Division), Dr. Brad Smith, Dr. Barbara Mahoney, Dr. Kaja Brix, and Dr. Lew Queirolo of the NMFS Alaska Regional Office, and Dr. Kim Shelden and Dr. Rod Hobbs of NMFS’ National Marine Mammal Laboratory.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


Inclusion of an incentive acts as a sign of good will on the part of the study sponsors and encourages reciprocity of that goodwill by the respondent. Singer (2002) provides a comprehensive review of the use of incentives in surveys. She notes that giving respondents a small financial incentive (even a token amount) in the first mailing increases response rates in mail-based surveys and is cost-effective. Such prepaid incentives are more effective than larger promised incentives that are contingent on completion of the questionnaire. In tests conducted by Lesser et al. (1999), including a $2 incentive in a mailing with four contact points was shown to increase response rates by an additional 19 to 31 percentage points. Thus, even a small upfront incentive typically is more cost effective than additional follow-up steps that are often considered.


To encourage participation in the mail survey, a $5 honorarium will be provided to participants in the initial mailing. During the pilot pretest implementation of this survey (OMB Control No. 0648-0621), we conducted a split-sample test of different honorarium amounts ($1, $5, and $10). A memorandum to OMB following completion of the pilot pretest implementation (sent to OMB on September 20, 2011) reported that the $5 and $10 incentives resulted in a statistically higher response rate than the $1 incentive. The response rates associated with the $5 and $10 incentive amounts were not statistically different. As a result, the $5 honorarium appears to be the least costly incentive that significantly increases response rates.
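For illustration, a two-proportion z-test of the kind that could underlie such a comparison is sketched below; the group sizes and return counts are hypothetical and are not the pilot pretest results.

```python
# Sketch of testing whether the $5 incentive group's response rate differs from
# the $1 group's. Counts are hypothetical, not the pilot results.
import numpy as np
from scipy.stats import norm

returned = np.array([95, 140])   # completed questionnaires: $1 group, $5 group
mailed = np.array([300, 300])    # deliverable questionnaires mailed per group

p1, p5 = returned / mailed
p_pool = returned.sum() / mailed.sum()
se = np.sqrt(p_pool * (1 - p_pool) * (1 / mailed[0] + 1 / mailed[1]))
z = (p5 - p1) / se
p_value = 2 * norm.sf(abs(z))
print(f"response rates: {p1:.2f} vs {p5:.2f}; z = {z:.2f}, p = {p_value:.4f}")
```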


There are several reasons why we believe inclusion of both a financial incentive and follow-up contacts will be needed to reach desired response rates. First, the survey addresses an issue that is unfamiliar to many Alaskans. As such, the chance that respondents will not be motivated to complete the survey is higher than for a survey on a more familiar subject. Second, although every attempt has been made to ensure the survey is easy to read, understand, and complete, the amount of information it must present and the number of questions it must ask result in a 16-page survey requiring more respondent attention than some surveys. For these reasons, we expect both incentives and follow-up contacts will be required to obtain a suitable response rate.


10. Describe any assurance of confidentiality provided to respondents and the basis for assurance in statute, regulation, or agency policy.


In the cover letter accompanying each mailing, respondents will be told that their name and address information will be kept separate from their responses and that only their responses will be given to researchers. The cover page of the survey also includes the following statement:


Your name and address will be kept separate from your responses. Only your responses will be provided to researchers for analysis.


Following completion of the data collection, the survey firm will delete any information identifying individuals (i.e., names and addresses) before any data file is delivered to NMFS or any other participating researchers and agencies.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private.


There are no questions of a sensitive nature asked in the survey.


12. Provide estimates of the hour burden of the collection of information.


The mail survey will be sent to a random sample of approximately 4,200 addresses. The random sample will be purchased from a professional sampling vendor. Based on previous experience, up to 15% of these types of samples can be expected to be bad or unusable addresses, which means the number of households receiving the survey will be approximately 3,570. We expect a final response rate of at least 50 percent (of the valid sample), leading to at least 1,785 (= 3,570 × 0.50) responding households returning completed surveys. The cover letter will solicit the participation of an adult head of the household to complete the survey. Our experience suggests respondents typically complete the survey in 20 to 25 minutes, so we assume 25 minutes in our computation of the potential burden hours. As a result, those ultimately completing the survey are expected to contribute up to 744 hours to the overall hour burden.
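The arithmetic behind these figures can be summarized as follows (numbers taken from the paragraph above; rounding to whole hours is an assumption):

```python
# Reproduces the mail-survey burden arithmetic described above.
addresses = 4200
valid = addresses * (1 - 0.15)        # up to 15% bad/unusable addresses -> 3,570
completes = valid * 0.50              # assumed 50% response rate -> 1,785
minutes_per_survey = 25
burden_hours = completes * minutes_per_survey / 60
print(int(valid), int(completes), round(burden_hours))  # 3570 1785 744
```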


Following the initial mailing and postcard, we expect approximately 70% of expected completes, or 1,250 households, to have returned completed surveys (based on results from the pilot pretest survey implementation). Households that have not responded after the initial mailing and postcard reminder will be contacted by telephone and encouraged to complete and return the survey. Households that need a replacement questionnaire will be identified and sent a new one. The phone interview is expected to take 2 minutes on average to complete. We expect to attempt to reach up to 36% of the 535 potential respondents who will eventually return the survey, or up to 193 individuals, and up to 36% of the 1,785 households that will not return the survey, or up to 643 individuals, for a total of 836 individuals representing approximately 28 burden hours (193 × 2 min + 643 × 2 min).5


Following the telephone prompts, a second full mailing will be attempted. This will not result in any additional burden hours.


After all contacts, we expect 1,785 responding households to have returned completed surveys, which leaves 1,785 non-responding households. A brief (2-page) non-respondent survey will be conducted with a sample of 750 of these non-responding individuals, each of whom will be sent the short survey by certified mail. Of the 750 individuals sent a non-response survey, we anticipate 33%, or approximately 250, will return completed surveys. The non-respondent survey is expected to take at most 5 minutes to complete, which results in an additional 21 hours.6


The total number of unique respondents to all survey contacts will be 2,678. This number consists of respondents who return the questionnaire (1,785), respondents who do not return the questionnaire but are reached during the telephone prompt contact (643), and the non-respondents to the main mail questionnaire who complete the non-response survey only (250).
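A companion sketch for the follow-up-contact, non-response, and total-respondent figures quoted above (again using the paragraphs' own numbers, with rounding to whole units assumed):

```python
# Reproduces the telephone, non-response, and unique-respondent arithmetic above.
phone_responders = round(0.36 * 535)        # 193 eventual mail respondents reached
phone_nonresponders = round(0.36 * 1785)    # 643 reached who never return the survey
phone_hours = (phone_responders + phone_nonresponders) * 2 / 60
nonresponse_hours = 250 * 5 / 60            # 250 completes at 5 minutes each
unique_respondents = 1785 + phone_nonresponders + 250
print(round(phone_hours), round(nonresponse_hours), unique_respondents)  # 28 21 2678
```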


Survey instrument | Estimated number of respondents | Estimated number of responses | Estimated time per respondent (minutes) | Estimated total annual burden hours
Mail survey (from initial mailing and postcard reminder) | 1,250 | 1,250 | 25 | 521
Mail survey (returned after phone contact and follow-up full mailing) | 535 | 535 | 25 | 223
Follow-up phone call | 643 a | 836 c | 2 | 21
Non-response survey | 250 | 250 | 5 | 21
Total | 2,478 b | 2,661 | | 786


a Number of successful phone contacts of households that have not returned completed surveys following initial mailing and postcard reminder.

b Total respondents reflect the total sample size minus the households that do not complete either the mail survey or phone interview.

c Includes 643 households that complete only the phone call, plus 193 households contacted by phone that also complete the mail survey.



13. Provide an estimate for the total annual cost burden to respondents or recordkeepers resulting from the collection (excluding the value of the burden hours in Question 12 above).


No additional cost burden will be imposed on respondents aside from the burden hours indicated above.


14. Provide estimates of annualized costs to the Federal government.


Annualized cost to the Federal government of the survey is approximately $50,000 per year, divided as follows: $40,000 in contract award money and $10,000 in staff time and resources. Contractor services include administration of the mail survey, the follow-up telephone calls, the non-response survey, and data validation.


15. Explain the reasons for any program changes or adjustments.


This is a new collection.


16. For collections of information whose results will be published, outline plans for tabulation and publication.


A report will summarize the survey development, testing, and implementation. It will present statistical summaries (i.e., means, variances, and frequency distributions) of data collected in the survey, and some basic analyses of the data. In addition, the econometric analysis of the stated preference choice experiment data will be reported in one or more papers that will be submitted for publication to a peer-reviewed environmental economics journal, such as Marine Resource Economics, Journal of Environmental Economics and Management, or Land Economics. It is also expected that the information produced from the econometric analysis of survey data may be used in regulatory analyses of Recovery Plan alternatives.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons why display would be inappropriate.


Not Applicable.


18. Explain each exception to the certification statement.


Not Applicable.


1 In cognitive interviews, individuals were specifically asked in what form they believed they would be paying for Cook Inlet beluga whale protection programs. The vast majority responded that the added cost in the choice questions simply represents money out of their pocket, mostly in the form of federal taxes, but also from some additional expenditures on seafood products.

2 Those needing a replacement survey will be mailed one following the telephone interview.

3 As noted above, the CIBW pilot pretest survey did not collect sufficient data for estimating value information.

4 See Loomis and White (1996) and Richardson and Loomis (2009) for summaries of the literature related to the valuation of threatened and endangered species.

5 Although we will attempt to reach all households in the sample that have not returned a completed survey to this point, we do not expect to be able to reach more than 193 in a timely and affordable manner.

6 Based on informal testing, we expect the two-page non-response survey will take respondents 3-5 minutes to complete on average, but for the purposes of calculating burden hours, we assume 5 minutes.


