SUPPORTING STATEMENT
PRELIMINARY CASE STUDY ASSESSING ECONOMIC BENEFITS OF MARINE DEBRIS REDUCTION
OMB CONTROL NO. 0648-xxxx
A. JUSTIFICATION
1. Explain the circumstances that make the collection of information necessary.
The National Oceanic and Atmospheric Administration (NOAA) is requesting approval for a new information collection to conduct a general population mail survey of households in Orange County, California. The survey will collect data on day trips to local beaches, which, when combined with marine debris measurements at these beaches, will be used to assess the degree to which marine debris impacts beach visitation decisions. This issue has not been addressed in the economics literature, and the study will allow NOAA to assess the potential utility of revealed preference models in assessing the economic benefits of reductions in marine debris to beach visitors. Lessons learned through the study will guide future efforts by NOAA to assess the economic benefits associated with reductions in marine debris at beaches.
The Marine Debris Research, Prevention, and Reduction Act of 2006 (33 U.S.C. §§ 1951 et seq.), together with the Marine Debris Act Amendments of 2012, established NOAA’s Marine Debris Program (hereafter referred to as “the Program”) to “identify, determine sources of, assess, prevent, reduce, and remove marine debris and address the adverse impacts of marine debris on the economy of the United States, the marine environment, and navigation safety.” Marine debris is defined as “any persistent solid material that is manufactured or processed and directly or indirectly, intentionally or unintentionally, disposed of or abandoned into the marine environment or the Great Lakes.” The Act directs the Program to “undertake outreach and education activities for the public and other stakeholders on sources of marine debris…and its adverse impacts on the United States economy…” The Act also directs the Program to “estimate the potential impacts of a severe marine debris event, including economic impacts on…tourism.”
The Program requires information on the impact of marine debris on beach visitors to adequately address the requirements of the Marine Debris Act that are related to the economy and tourism and to assess the benefit of restoration projects related to marine debris removal within the context of natural resource damage assessments conducted by NOAA under the Oil Pollution Act (33 U.S.C. §§ 2701 et seq.).
The economic benefits of reductions in marine debris accrue to multiple parties, including those who actively use and derive direct benefit from beaches and those members of the general population who simply care about clean beaches whether they use the beach or not. Potential direct beneficiaries of reductions in marine debris—those who receive values from the direct use of the resource—include beach visitors and local property owners. Passive use value (also known as non-use or existence value) “is the willingness to pay for the preservation or improvement of natural resources, without any prospect or intention of direct or in-situ use of the resource” (Haab and McConnell 2002, p. 16). Because use and non-use values accrue to different segments of the population, different methods are necessary to evaluate changes in use and non-use values due to changes in marine debris. Further, because use values accrue to different user populations (for example, beach visitors and property owners), different studies and methods are necessary to estimate use values for each sub-population.
The current study uses a revealed preference valuation method to estimate the changes in use values that accrue to beach visitors in Orange County, California due to changes in marine debris. The study does not estimate non-use values and it does not use stated preference methods.
The proposed information collection will allow NOAA to implement a preliminary study focused on the impact of marine debris on beach visitors in Orange County, California. The study results will be used to develop and refine NOAA’s future efforts to assess the overall economic impacts of marine debris. These future efforts may include revealed preference valuation studies focused on beach visitation in other coastal areas of the United States.
2. Explain how, by whom, how frequently, and for what purpose the information will be used. If the information collected will be disseminated to the public or used to support information that will be disseminated to the public, then explain how the collection complies with all applicable Information Quality Guidelines.
Overview
The survey data will be used by the Marine Debris Program in a study designed to estimate the economic benefits of reductions in marine debris to Orange County beach visitors. Data on survey respondents’ trips to local beaches will be combined with data on travel costs and beach characteristics to estimate the parameters of a random utility maximization (RUM) travel cost model. The beach characteristics data will be gathered outside of the survey effort and will include on-site, quantitative measurements of marine debris. RUM travel cost models are a type of discrete choice model frequently used by economists to describe recreation site choice decisions and to assess the economic gains (losses) associated with improvements (declines) in the quality of specific recreation sites (see Haab and McConnell 2002). The RUM model will be used to estimate welfare gains to beach visitors associated with reductions in marine debris concentrations. Marine debris can lead to welfare losses for beach visitors by diminishing the quality of their visits to the beach, by causing them to travel to less desirable alternative beaches, or by causing them to pursue alternative activities.
The RUM travel cost model is a revealed preference valuation approach, relying on individuals’ actual behavior (i.e., selecting beaches to visit and incurring the associated travel costs) to make inferences about willingness to pay (WTP). Individuals may be willing to pay for reductions in marine debris on beaches that they visit for a variety of reasons, including aesthetic concerns and concerns about potential health effects. The RUM model does not distinguish among these various motivations; it simply uses data on beach choices to estimate WTP for reductions in marine debris at beaches.
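As a sketch of this framework, the beach choice problem can be written as a standard conditional logit model; the notation below is illustrative only and is not the study’s final specification:

```latex
% Illustrative conditional-logit sketch (not the study's final specification).
% Individual i chooses beach j from choice set C; tc_{ij} is travel cost,
% d_j is marine debris density, and x_j holds other site characteristics.
U_{ij} = \beta_{tc}\, tc_{ij} + \beta_{d}\, d_{j} + \mathbf{x}_{j}'\boldsymbol{\gamma} + \varepsilon_{ij}

\Pr(i \text{ chooses } j)
  = \frac{\exp\left(\beta_{tc}\, tc_{ij} + \beta_{d}\, d_{j} + \mathbf{x}_{j}'\boldsymbol{\gamma}\right)}
         {\sum_{k \in C}\exp\left(\beta_{tc}\, tc_{ik} + \beta_{d}\, d_{k} + \mathbf{x}_{k}'\boldsymbol{\gamma}\right)}

% Per-trip welfare change from a debris reduction (log-sum form), where
% V^0 and V^1 are deterministic utilities before and after the change:
\Delta W_i = \frac{1}{-\beta_{tc}}
  \left[\ln\!\sum_{k \in C} e^{V_{ik}^{1}} - \ln\!\sum_{k \in C} e^{V_{ik}^{0}}\right]
```

Under this sketch, a statistically significant negative estimate of the debris coefficient would indicate that debris reduces the utility of a visit, and the log-sum expression converts that effect into a per-trip willingness-to-pay measure.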
Background
Marine debris is widely acknowledged to be a persistent problem in many coastal areas of the United States. There are a variety of potential economic losses associated with marine debris, including costs incurred by local governments and volunteer organizations to remove and dispose of marine debris, impacts on waterfront property values due to diminished aesthetic appeal, and potential effects on recreational and commercial fisheries.
One of the more significant potential economic losses is that suffered by beach visitors due to the presence of marine debris on beaches. Beach visitors are likely to be concerned about marine debris both because it poses potential physical harm due to lacerations, bacterial infections, or entanglements during swimming, and because it may detract from the perceived natural beauty of an area. In contrast to debris or litter along the roadside or in parks, there is a high potential for dermal contact with marine debris on beaches as visitors frequently go barefoot, lie directly on the sand, and dig in the sand. The existence of numerous volunteer efforts to remove debris from beaches and the fact that many municipalities regularly rake beaches to remove debris is an indication that beach visitors are negatively impacted by the presence of marine debris.
Orange County, California was selected for the study because it has numerous well-defined, popular beaches located very close to a large urban area. As beaches are closely linked to the culture of the local area, visitation rates are likely to be reasonably high, which will facilitate the collection of data on beach trips from the general population. In addition, there appears to be sufficient variation in factors potentially associated with marine debris (e.g., population densities, local land use, frequency of beach cleaning, locations of river mouths, etc.) that it seems reasonable to expect marine debris levels to vary across sites (Moore et al. 2001). The presence of a variety of local beaches provides an opportunity to determine, through statistical modeling of beach choices, whether residents choose to travel farther from their homes or to visit beaches that are less desirable in other respects, in order to recreate at beaches that have lower densities of marine debris.
Details and Purpose of Information Collection
The Orange County study will provide an important contribution to the empirical literature on RUM travel cost models. There are currently no published RUM travel cost models that attempt to link beach trip choices with physical measurements of marine debris at individual beaches. While Parsons et al. (2009) include indicators of beach cleaning activities in a RUM model focused on Texas Gulf Coast beaches, they do not obtain objective measurements of the amount of debris at each beach. Other recent beach recreation models do not address marine debris at all, including the Southern California beach model (Hanemann et al. 2004), a model focused on visits to New Jersey, Delaware, and Maryland beaches (Parsons and Massey 2003), and a model focused on visits to San Diego County beaches (Lew and Larson 2005). Thus, while the literature has demonstrated the importance of beach characteristics such as width, length, and various amenities, the presence of marine debris has not been considered.
The primary research goal of the study is to use revealed preference data on beach trips to quantify the relationship between marine debris (number of items per square meter) and beach visitation choices, while controlling for a variety of factors that may impact these choices, such as travel cost, beach width, beach length, and available amenities. That is, the data will allow us to assess whether or not the quantity of marine debris on a beach has a statistically significant impact on the utility that an individual receives from a beach visit, and it will allow us to estimate the magnitude of that impact.
Within the framework of this primary goal and contingent on finding an impact, we will explore a number of additional research questions, including:
What specific types of marine debris have the greatest impact on beach choices (e.g., plastic, metal, glass, etc.)?
Does the impact of marine debris on beach choice vary in a systematic way across respondents (e.g., are visitors with children more sensitive to marine debris levels)?
Does the impact of marine debris on beach choices depend on the location of the debris on the beach (e.g., within the wrack line, on the foreshore, or on the backshore)?
Is there a threshold above which marine debris seems to influence beach choices and below which it does not?
What is the relationship between respondents’ perceptions of marine debris and actual marine debris levels?
These secondary research questions have not been addressed in the economics literature, as researchers have not had access to the type of data that will be collected in this study: data on beach choices combined with detailed data on marine debris.
As discussed in Part B, focus groups recently conducted in Orange County indicate that some residents are concerned about the presence of marine debris on local beaches and would refrain from visiting specific local beaches as a result of these concerns. Furthermore, all participants indicated that they had actually observed marine debris at local beaches. These findings, although derived from a small sample of residents, suggest that a revealed preference model (e.g., a RUM model) may indeed be successful in quantifying the relationship between marine debris and beach visitation choices.
In addition to contributing to the RUM literature, the Orange County study will allow NOAA to assess the potential utility of this type of model in assessing the economic benefits of reductions in marine debris; lessons learned through the study will guide future efforts by NOAA to assess the economic benefits associated with reductions in marine debris at beaches. A specific research agenda for these future efforts has not yet been established, as it is partially contingent on the results of the current study. NOAA does not plan to simply scale up the current study to a larger geographic region. However, because future research efforts may also apply a similar methodology, an assessment of its usefulness is required. In particular, this study will help NOAA determine whether or not individuals living near the coast are generally aware of marine debris levels and whether or not they consider these levels when selecting a beach. These are necessary conditions for the use of RUM travel cost models to assess WTP for reductions in marine debris.
The data collection consists of two surveys: a primary survey and a non-respondent follow-up survey. The primary survey will be implemented by mail in November/December 2013 using address-based sampling (Link et al. 2008; Brick, Williams, and Montaquila 2011; Iannacchione 2011). The primary survey includes questions that focus on beach day trips, beach activities, marine debris at local beaches, and demographic characteristics (see below for a description of each survey question). With regard to beach day trips, the respondent is asked to indicate the specific local beaches that he or she visited over the past three months and the number of day trips taken, by month, to each beach.
The implementation sequence for the mail survey will be as follows:
Day 1: An advance letter will be sent to all sampled households via first class mail. The letter will notify the household that a survey is on the way, describe the purpose of the survey, and encourage the individual to respond.
Day 5: The primary survey will be mailed to all sampled households via first class mail. The survey instrument will include a $2 cash response incentive, a letter, a color map of local beaches, and a self-addressed, stamped envelope.
Day 12: A thank you/reminder postcard will be mailed to all sampled households, thanking them for responding and encouraging them to complete the survey if they have not already done so.
Day 26: A replacement survey instrument will be sent via first class mail to all sampled households that have not yet responded. The replacement survey will include a letter, a color map, and a self-addressed, stamped envelope.
The non-respondent follow-up survey will be conducted with a sub-sample of an estimated 600 individuals who do not respond after the first two survey mailings in order to investigate the potential for nonresponse bias. The non-respondent follow-up survey will be sent via two-day FedEx and will consist of a letter, a five-question survey, and a self-addressed, stamped envelope. The non-respondent follow-up survey will be implemented four weeks after the replacement survey has been mailed.
The questions in the non-respondent follow-up survey are a subset of the questions from the main survey. The wording of the questions is identical across the two surveys in order to facilitate comparisons and the potential development of adjustment weights. The first two questions focus on the frequency with which the individual visits beaches in the local area: Question 1 asks about trips over the past 12 months, while Question 2 asks about trips over the past summer (June, July, and August). By comparing responses to these two questions with responses to identical questions in the main survey, we will be able to assess whether respondents take more trips to local beaches than non-respondents.
The second two questions in the non-respondent follow-up survey focus on the extent to which the individual is concerned about marine debris on local beaches: Question 3 asks directly about the individual’s level of concern about marine debris (on a 1 to 5 scale), while Question 4 asks about the individual’s participation in beach cleanup efforts. By comparing responses to these two questions with responses to identical questions in the main survey, we will be able to assess whether respondents are more concerned about marine debris than non-respondents.
If respondents differ from non-respondents in either of these two ways (number of beach trips and level of concern about marine debris), then a RUM model that relies entirely on data from respondents may produce biased estimates of WTP for reductions in marine debris. If substantial differences between respondents and non-respondents are observed, the non-respondent follow-up survey results can be used to develop non-response adjustment classes for re-weighting the respondent data (see, e.g., Lohr 1999 or Groves et al. 2004). The non-respondent follow-up survey furthers the research goals of the current study by assessing the potential for non-response bias so that it may be addressed in future efforts, if necessary.
An alternative approach to assessing potential non-response bias is to incorporate questions in the survey instrument that are identical to questions that have been included in large scale, high quality social science survey efforts (i.e., “benchmarking”). For example, questions about the respondent’s opinion regarding government spending on the environment (“Too little”, “About the right amount,” or “Too much”) have frequently been included in the National Opinion Research Center’s General Social Survey (GSS). When an identical question is added to a survey instrument in a study that focuses on environmental issues, one can compare the responses with the GSS responses to determine whether individuals who are more inclined to favor government spending on environmental issues are also more likely to respond to the survey.
We do not use a benchmarking approach in the current survey effort because the current survey focuses on a relatively small geographic area, Orange County, California. The GSS and other social science surveys do not have large enough sample sizes at the county level for comparisons that would have any reasonable level of statistical power. Furthermore, using state-level data for comparisons would not be informative as a sample of Orange County residents could not be considered to be representative of the state of California.
The content and specific purpose of each question is described below.
Primary Survey:
Question 1 asks if the respondent has ever visited a local beach. This question simply allows respondents to skip the initial questions about beach trips if they do not visit local beaches at all.
Question 2 asks about the number of trips taken to local beaches over the past year. This question provides annual trip data that can potentially be used in transferring benefit estimates for the three-month study period (which will be used in the RUM model) to an entire year.
Question 3 asks about activities that the respondent participates in at local beaches. This question provides information that will be useful in determining the model structure, as respondents’ beach choices may depend on the types of activities they would like to pursue.
Question 4 asks about the typical transportation mode that the respondent uses when visiting local beaches. For respondents using vehicles to access the beach, there is a follow-up question about the number of adults and children typically in the vehicle. These questions will be used in calculating travel costs for the beach choice model.
Questions 5 and 6 ask about beach characteristics that are important to the respondent. This will be useful in determining which beach attributes to include in the choice model and in assessing the importance of marine debris relative to other beach characteristics.
Questions 7 and 8 ask about awareness of local beaches and whether there are certain beaches that the respondent would never consider visiting. These questions will be used in developing the “choice set” for the RUM model, or the set of beaches that the respondent considers visiting.
Questions 9 and 10a ask about the specific destinations of trips taken to local beaches during the previous three months (June, July, and August). These questions will provide the beach trip data necessary to estimate the choice model.
Question 10b asks about the amount of marine debris observed at each beach that the respondent visited. This question provides data that will allow for an investigation of the relationship between perceived and actual marine debris concentrations at each beach (with actual concentrations measured on site).
Question 11 asks about the extent to which the respondent is concerned about marine debris in general. Responses to this question could be used to develop models that allow for marine debris preferences to vary across respondents.
Question 12 asks about the extent to which the respondent is concerned about various types of marine debris. This provides information that can be used in constructing the marine debris variable for the choice model from on-site data on marine debris (which will be classified by type).
Questions 13, 14, and 15 ask about the types of marine debris that the respondent has actually seen at local beaches. These questions will help in interpreting responses to Question 12, as respondents’ beach choices are not likely to be influenced by a particular type of marine debris if they never actually observed it at local beaches.
Question 16 asks the respondent to indicate the local beaches where they think marine debris is a problem. This question, like Question 10b, provides data that will allow for an investigation of the relationship between perceived and actual marine debris levels. This question is included in addition to Question 10b because Question 10b focuses only on the beaches that respondents actually visited.
Question 17 asks about respondents’ perceptions of the sources of marine debris. This information may be used in specifying the choice model, as visitors may respond differently to marine debris depending on where they believe it originated.
Question 18 asks about the respondents’ participation in beach cleanups. This question is included to assess potential nonresponse bias. An identical question is included on the non-respondent follow-up survey.
Question 19 asks the respondent to indicate the local beaches where they think crowding is a problem. This question will be used to develop measures of crowding that can be used in the site choice model.
Question 20 asks about the number of adults and children in the household. The number of adults will be used (together with household income, Question 26) to determine the approximate wage rate for calculating the opportunity cost of time in the RUM model. In addition, responses will be used in developing weights that account for differing selection probabilities across households due to differences in the number of adults.
Questions 21 to 26 ask for demographic details, including gender, age, race/ethnicity, education, and household income. The responses to these questions will allow us to assess the representativeness of the survey respondents through comparisons with census data for Orange County.
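The travel cost and opportunity-cost-of-time elements described for Questions 4 and 20 can be illustrated with a simple calculation. The function below is a sketch only; the per-mile cost, assumed driving speed, and the common one-third-of-wage value of time are illustrative assumptions, not the study’s actual parameter choices:

```python
def travel_cost(one_way_miles, hourly_wage,
                cost_per_mile=0.565,    # assumed per-mile vehicle operating cost
                speed_mph=30.0,         # assumed average driving speed
                time_value_share=1/3):  # common fraction of the wage rate used
                                        # to value travel time (assumption)
    """Illustrative round-trip travel cost for one beach visit."""
    driving_cost = 2 * one_way_miles * cost_per_mile
    time_cost = 2 * (one_way_miles / speed_mph) * time_value_share * hourly_wage
    return driving_cost + time_cost
```

For example, a 15-mile one-way trip by a visitor with a $24.81 hourly wage would carry a round-trip cost of roughly $25 under these assumed parameters.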
Non-Respondent Follow-Up Survey:
Questions 1 and 2 ask about the number of trips taken to beaches in the local area over the last year and over the last three months. These questions will allow us to assess any potential avidity bias in the primary survey. Responses to Question 1 will be compared with responses to Question 2 from the primary survey, and responses to Question 2 will be compared with responses to Question 10a from the primary survey.
Questions 3 and 4 ask about concern about marine debris (Question 3) and participation in beach cleanups (Question 4). These questions will allow us to assess whether respondents tend to care more about marine debris issues than non-respondents. Responses to Question 3 will be compared with responses to Question 12 on the primary survey, and responses to Question 4 will be compared with responses to Question 18 on the primary survey.
Question 5 asks about the number of adults and children in the household. Responses to this question will be used in developing weights that account for differing selection probabilities across households due to differences in the number of adults.
The NOAA Marine Debris Program will retain control over the information and safeguard it from improper access, modification, and destruction, consistent with NOAA standards for confidentiality, privacy, and electronic information. See response to Question 10 of this Supporting Statement for more information on confidentiality and privacy. The information collection is designed to yield data that meet all applicable information quality guidelines. Although the information collected is not expected to be disseminated directly to the public, results may be used in scientific, management, technical or general informational publications. Should NOAA Marine Debris Program decide to disseminate the information, it will be subject to the quality control measures and pre-dissemination review pursuant to Section 515 of Public Law 106-554.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological techniques or other forms of information technology.
The data will be collected via a mail survey using address-based sampling. The data collection does not use automated, electronic, mechanical, or other technological techniques or other forms of information technology.
The research team believes that a mail-only survey mode offers the best opportunity for obtaining a high response rate at a reasonable cost, while allowing for the use of visual aids (i.e., a map of local beaches). Potential alternative modes include a web-based survey and an in-person survey. However, existing probability-based web panels (e.g., GfK Knowledge Networks) would have inadequate sample sizes at the county level, and the cost associated with fielding an in-person survey would be unreasonable for an exploratory effort. While it would be possible to provide a Web URL that allows mail survey respondents to complete the survey over the internet, recent research has found that providing an internet option in a mail survey does not improve response rates relative to a mail-only approach (Messer and Dillman 2011; Medway and Fulton 2012).
4. Describe efforts to identify duplication.
A review of the literature did not identify any existing research on the economic impact of marine debris on beach visitors that relied on actual measurements of the quantity of marine debris. While Parsons et al. (2009) include “manual cleaning” and “machine cleaning” variables in a RUM model focused on Texas Gulf Coast beaches, they do not obtain objective measurements of the amount of debris at each beach. Other recent beach recreation models include no measure of marine debris at all, including the Southern California beach model (Hanemann et al. 2004), a model focused on visits to New Jersey, Delaware, and Maryland beaches (Parsons and Massey 2003), and a model focused on visits to San Diego County beaches (Lew and Larson 2005).
5. If the collection of information involves small businesses or other small entities, describe the methods used to minimize burden.
The proposed information collection will focus on households and will not impact small businesses or other small entities.
6. Describe the consequences to the Federal program or policy activities if the collection is not conducted or is conducted less frequently.
If the information collection is not conducted, then the Program will have difficulty moving forward with a research program aimed at advancing our knowledge concerning the economic impacts of marine debris on the United States economy. The study is a necessary step towards this goal, as it allows the Program to assess strengths and weaknesses of using a RUM travel cost model to estimate economic gains and losses for beach visitors.
7. Explain any special circumstances that require the collection to be conducted in a manner inconsistent with OMB guidelines.
The proposed information collection will be conducted in a manner that is consistent with OMB guidelines.
8. Provide information on the PRA Federal Register Notice that solicited public comments on the information collection prior to this submission. Summarize the public comments received in response to that notice and describe the actions taken by the agency in response to those comments. Describe the efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
A Federal Register Notice published on March 11, 2013 (78 FR 15355) solicited public comment. Only one comment was received: an individual requested a draft version of the survey instrument, and a draft was provided to that individual.
In addition to the Federal Register notice, comments on the survey materials were solicited from the following persons outside the agency:
Dr. Timothy Haab, Professor and Chair, Department of Agricultural, Environmental, and Development Economics, The Ohio State University.
Dr. Christopher Leggett, consultant to Industrial Economics, Incorporated, Cambridge, Massachusetts.
Nine residents of Orange County, who were interviewed during two focus group discussions conducted in Irvine, California in February 2013 (see details in Part B, Question 4).
Each person was asked to provide feedback on survey design, including length and clarity of instructions.
9. Explain any decisions to provide payments or gifts to respondents, other than remuneration of contractors or grantees.
A monetary incentive of $2 will be provided with the survey materials. Previous survey research suggests that prepaid monetary incentives are among the most effective tools for increasing response rates (Dillman 2009). A prepaid incentive can also draw in individuals who would otherwise have no interest in participating in the survey (Groves et al. 2006), an issue of particular relevance to non-response bias. Moreover, Dillman (2009, p. 241) explains that a token paid up front, rather than after survey completion, turns the transaction from a financial exchange into a social exchange, avoiding the problem of establishing a price point for completion of a survey. A $2 incentive has been shown to be one of the most effective levels and is widely used in survey implementation (Lesser et al. 2001). The mailing will therefore include $2 in cash as an unconditional incentive for completion of the questionnaire, to encourage response from this population.
10. Describe any assurance of confidentiality provided to respondents and the basis for assurance in statute, regulation, or agency policy.
NOAA will not collect any identifying information about survey respondents other than a household address. All addresses will be removed from the database (with the exception of zip code) after NOAA has calculated, for every household, the driving distance to every beach in the local area. These distance calculations are necessary for the estimation of the RUM travel cost model. All travel distances will be rounded to the nearest half-mile so that individual households cannot be identified ex post through trilateration-type procedures.
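The half-mile rounding step can be sketched as follows (the function name is ours, for illustration):

```python
def round_to_half_mile(distance_miles):
    """Round a driving distance to the nearest 0.5 mile, so that exact
    household locations cannot be recovered from the distance data."""
    return round(distance_miles * 2) / 2
```

For example, a computed driving distance of 3.24 miles would be stored as 3.0 miles, and 3.3 miles as 3.5 miles.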
The survey materials will include a statement that the respondent’s name will be removed from NOAA’s database after NOAA receives the completed questionnaire. In addition, the survey materials will state that all information provided “will remain confidential to the extent permitted by law.” No other confidentiality assurances will be provided to the respondent.
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private.
No questions of a sensitive nature will be asked.
12. Provide an estimate in hours of the burden of the collection of information.
The proposed collection involves two one-time surveys: the primary survey and the non-respondent follow-up survey.
Primary Survey: This general population mail survey will be sent to 4,000 households. Assuming a 30 percent response rate, there will be 1,200 respondents and 2,800 non-respondents.3
Non-respondent Follow-Up Survey: This survey will be mailed to a sample of 600 households that did not respond to the primary survey. Assuming a 20 percent response rate, this survey will have 120 respondents and 480 non-respondents.
Based on focus groups, we assume that each respondent will spend 20 minutes completing the primary survey and that non-respondents will spend five minutes completing the non-respondent follow-up survey. We therefore estimate the total burden of this collection at 410 hours (Table 1). This is a one-time data collection, so no additional costs are expected for respondents.
Table 1. Total Estimated Burden
Survey | Responses | Completion Time | Burden Hours
Primary Survey | 1,200 | 20 minutes | 400
Non-respondent Follow-Up Survey | 120 | 5 minutes | 10
TOTAL | 1,320 | | 410
The Bureau of Labor Statistics reports a mean hourly wage for all occupations for the Santa Ana-Anaheim-Irvine, California Metropolitan Division of $24.81 (Bureau of Labor Statistics 2013). Multiplying the 410 burden hours by this mean hourly wage yields a total labor cost of $10,172.10 (Table 2).
Table 2. Total Estimated Labor Cost
Activity | Sector | Annual Number of Responses | Total Annual Burden Hours | Dollar Value Per Burden Hour | Total Labor Cost
Completing Survey | Private Individuals | 1,320 | 410 | $24.81 | $10,172.10
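The arithmetic behind Tables 1 and 2 can be verified directly; the figures below are taken from the text above ($24.81 is the BLS mean hourly wage for the metropolitan division).

```python
# Burden hours: responses multiplied by per-response completion time.
primary_hours = 1200 * 20 / 60    # 1,200 responses at 20 minutes each -> 400 hours
follow_up_hours = 120 * 5 / 60    # 120 responses at 5 minutes each -> 10 hours
total_hours = primary_hours + follow_up_hours  # 410 hours

# Labor cost: total burden hours valued at the mean hourly wage.
wage = 24.81
labor_cost = total_hours * wage   # $10,172.10
```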
13. Provide an estimate of the total annual cost burden to the respondents or record-keepers resulting from the collection (excluding the value of the burden hours in Question 12 above).
There will be no recordkeeping/reporting costs resulting from the collection.
14. Provide estimates of annualized cost to the Federal government.
The total annualized cost to the Federal Government is $291,284.48, consisting of two components:
(1) Operational Expenses: All operational costs will be incurred by the contractor, Industrial Economics, Incorporated. The contract with Industrial Economics is for $285,004.48, which includes survey design and testing, survey implementation, data analysis, and reporting.
(2) Labor Costs for Staff: The estimated time required for Marine Debris Program staff to oversee the information collection is 80 hours, at an hourly rate of $78.50 (including overhead) for an AAAS Science and Technology Policy Fellow, resulting in total staff labor costs of $6,280.
15. Explain the reasons for any program changes or adjustments.
This is a new program.
16. For collections whose results will be published, outline the plans for tabulation and publication.
Statistical summaries of responses to all survey questions will be developed, including the mean, minimum, maximum, and standard deviation for questions with numerical responses; and response frequencies for questions with categorical response options. In addition, responses related to beach visits will be analyzed within the context of a RUM travel cost model as described in detail in Part B of this supporting statement.
The overall schedule for the study is as follows:
Measure marine debris at beaches: July and August 2013
Print and coordinate survey materials: August 2013
Implement survey: November 2013 to December 2013
Analyze results and develop report: January 2014 to March 2014
The project report will be posted in PDF format on the NOAA Marine Debris Program website (http://marinedebris.noaa.gov).
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons why display would be inappropriate.
The expiration date for OMB approval will be displayed on all surveys associated with this information collection.
18. Explain each exception to the certification statement.
There are no exceptions to the certification statement.
1 While reductions in marine debris may reduce marine debris removal costs incurred by local municipalities (e.g., beach raking costs), reductions in these costs could not be used to assess benefits, as (1) some beaches are not cleaned at all by local municipalities and (2) the behavior of these municipalities will not necessarily reflect the preferences of beach visitors.
2 Additional details regarding the RUM model are provided in Part B of the Supporting Statement.
3 Link et al. (2008) achieved response rates of approximately 20 to 37 percent in an address-based mail survey using the CDSF.