U.S. Department of Commerce
National Oceanic & Atmospheric Administration
Assessing Public Preferences and Values to Support Coastal and Marine Management
OMB Control No. 0648-0829
Up-to-date socio-economic data is needed to support the individual NERR sites’ conservation and management goals. The initial surveys will be conducted for the Chesapeake Bay National Estuarine Research Reserve in Virginia (CBNERR-VA), Weeks Bay NERR (WBNERR), and Grand Bay NERR (GNDNERR), and the survey will be repeated regularly in other NERRs based on information needs and budget. This information will be used by NOAA and others to:
understand who visits the NERRs,
identify impacts of management decisions,
inform management decisions, and
extend education and outreach efforts.
NOAA has a vested interest in this research as it supports comprehensive ocean and coastal planning and management. The data collected and resulting products are beneficial for NOAA, resource managers, and policy-makers.
Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
NOAA’s mission is to provide science, service, and stewardship for, among other activities, the management of the nation’s oceans and coasts. NOAA supports “comprehensive ocean and coastal planning and management” to facilitate the use of oceans and coasts, ensuring “continued access to coastal areas, sustained ecosystems, maintained cultural heritage, and limited cumulative impacts.” NOAA is subject to and supports mandates of the Coastal Zone Management Act (CZMA) (16 U.S.C. § 1452 (303)(2)(D)), which encourages the wise use of coastal resources, including activities associated with energy generation. The CZMA also encourages the inclusion and participation of the public in carrying out the tenets of the act (16 U.S.C. § 1452 (303)(4)). The National Environmental Policy Act (NEPA) (42 U.S.C. § 4332(2)(A)) mandates that Federal agencies use social science data to assess the impacts of Federal actions on the human environment.
The National Estuarine Research Reserve System (NERRS) is a federal-state partnership program for the stewardship, education, and research of unique estuarine sites. This data collection supports the NERRS’ vision of establishing healthy estuaries and coastal watersheds where human and ecological communities thrive. The NERRS has identified five priority research areas, including a focus on social science and economic processes within each NERR site. However, limited data exists characterizing stakeholder opinions, values, attitudes, and behaviors, including their spatial aspects.
To support these efforts, the agency needs to collect socioeconomic data for the uses listed above.
Specific justifications for both surveys are included below (see Enclosure 5 for the full list of survey questions):
“York River Outdoor Recreation Survey” Question Justifications
Section/Question | Justification/Data use | Literature (if relevant)
Section 1. The purpose of this section is to gather information on trip details such as group size, visit frequency, and time spent in the park. These details are important for creating a visitor profile for the York River and its surrounding areas. | |
Q1. Looking at the map above, did you take any trips to the York River or any surrounding park or natural area for outdoor recreation within the last 12 months? | This question confirms that the respondent has visited the study site within the past year and is therefore eligible for the recreation questions. |
Q2. Approximately, how many trips did you take to the York River or any surrounding park or natural area for outdoor recreation within the last 12 months? | Prior visits are an important metric in understanding visitor behavior, attitudes, and preferences. This information will help managers build their visitor profile. Additionally, trip frequency data support economic valuation efforts using the travel cost method, which considers trip expenditures and visitor demographics (see the illustrative sketch following this table). |
Q3. When was the last time you took a trip to the York River or any surrounding park or natural area for outdoor recreation? | Knowing when respondents last visited the area provides context for subsequent survey questions. This information helps determine the recency and seasonal aspects of their visit, which are important for interpreting their responses. |
Q4. Including yourself, how many people were in your personal group on this trip? | Visitor group size and member ages are used to estimate overall visitation to a recreation site and help managers plan for infrastructure needs based on travel party dynamics. Defining the “personal group” helps differentiate it from the broader travel party. |
Q5. Including yourself, how many of these people were at least 18 years old? | Research indicates that adults and youths prefer park amenities differently (Baran et al., 2014). |
Q6. How many of these people were at least 65 years old? | |
Q7. If there were any children in your personal group, how many were under 5 years old? | |
Q8. Approximately, how much time did you spend specifically within the York River or any surrounding park or natural area during this trip? | This information is important for understanding the overall trip characteristics of park visitors and helps estimate and characterize overall visitation to the study area. |
Section 2. The purpose of this section is to identify the specific locations visited by respondents around the York River, activity participation, technology use, and transportation methods. This information supports managers in planning infrastructure improvements, adding new activities, and monitoring trends over time. | |
Q9. Referring to the map on page 2, did you or your personal group visit any of the following locations on this trip? | This information will help park managers determine the most popular locations around the York River. Subsequent questions ask for details about this trip, so knowing where respondents visited is necessary for interpreting their answers correctly. |
Q10. Did you participate in any of the following activities within the York River or any surrounding park or natural area during this trip? | This information will help park managers understand the most popular activities so they can improve the visitor experience and ensure visitors are aware of all available activities. It is also useful for prioritizing budgets toward the amenities that matter most to visitors. | Recreation visitors can participate in various activities, and the activities they choose can determine how much time they spend in outdoor recreation (Grooms et al., 2020).
Q11. When participating in water-based activities, such as swimming, kayaking, or boating, did you bring a mobile device, such as a smartphone, tablet, or smartwatch, with you? | This information will be used to assess visitor use of and reliance on technology in natural settings. Park managers can use these data to determine whether they need to improve technological connectivity within their park. | According to the 2017 Virginia Outdoors Demand Survey, 81.5% of respondents used a smartphone during outdoor recreation activities (Ellis et al., 2017).
Q12. When participating in land-based activities, did you bring a mobile device, such as a smartphone, tablet, or smartwatch, with you? | |
Q13. If you brought a mobile device and kept it on, did you use it for any of the following reasons? | |
Q14. Did you or your personal group use any of the following forms of transportation to reach your primary destination on this trip? | This question helps estimate visitor access and transportation preferences within the park. Park managers are particularly interested in the use of public transportation options. |
Section 3. The purpose of this section is to identify visitor spending habits and visitor experiences around the York River. This information is important for estimating recreation values and informing management decisions aimed at enhancing visitor experiences and park infrastructure. | |
Q15. Did you purchase a Virginia State Park annual pass in the last 12 months? | This information will help managers understand the types of passes visitors use, whether specific passes are more commonly used than others, and whether patterns change over time. It will also be used to develop a travel cost model to understand the recreation value of the area. |
Q16. How much did you spend on your Virginia State Park annual pass in the last 12 months? | |
Q17. Did you purchase a Virginia hunting or fishing license in the last 12 months? | |
Q18. How much did you spend on your Virginia hunting or fishing license(s) in the last 12 months? | |
Q19. Please estimate how much your personal group spent on the following items during this trip. If no money was spent on an item, please mark it as $0. | This information will be used to develop a travel cost model to understand the recreation value of the area. |
Q20. How important to you were the following features when deciding to take this trip? | This information can be used to inform management decisions by identifying both strengths and areas for improvement. Managers can leverage these data to plan effectively for the future. | The features provided in the survey were modified from motivations listed in various State Comprehensive Outdoor Recreation Plans (SCORP) (e.g., Rushing et al., 2021; Strickler et al., 2018).
Q21. Looking at this same list of features, how satisfied were you with the quality of each of the following features on this trip? | |
Section 4. The purpose of this section is to estimate future visitation and identify potential barriers that may affect visitation to the York River and surrounding areas for outdoor recreation. This information will enable managers to develop targeted strategies to attract visitors and enhance site offerings effectively. | |
Q22. Do you intend to visit the York River or any surrounding parks or natural areas for outdoor recreation in the future? | This information can be used to understand visitation expectations and identify factors that may deter visitation. Managers can leverage these data to better attract visitors and tailor their sites to visitor preferences. | The potential barriers provided in the survey were modified from those listed in various State Comprehensive Outdoor Recreation Plans (SCORP) (e.g., Rushing et al., 2021; Strickler et al., 2018).
Q23. Regardless of how you answered the question above, which of the following are reasons why you may not visit the York River or surrounding areas for outdoor recreation in the next 12 months? | |
Section 5. The purpose of this section is to characterize the demographic profile of respondents. These data will be used for weighting to ensure representativeness within the target population and for subgroup analyses. Understanding the park user community is essential for effective management and planning. | |
Q24-Q36 | These are demographic questions. See the Section 5 justification above. |
Q27. When visiting an area for outdoor recreation, what language do you prefer information to be in? | This information can be used to assess whether current park infrastructure meets visitors’ language needs. Managers can use it to make decisions about translating signage and other materials, and about training or hiring staff proficient in specific languages to enhance communication with park visitors. |
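Several of the justifications above reference the travel cost method (Q2, Q15, and Q19). For illustration only, the sketch below shows the kind of single-site, count-data travel cost model those justifications describe, assuming a simple Poisson specification; the variable names and simulated data are hypothetical and are not part of the survey instrument or an approved analysis plan.

```python
# Illustrative sketch only: a minimal single-site travel cost model of the
# kind referenced in the Q2, Q15, and Q19 justifications. The toy data and
# coefficients are hypothetical placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
travel_cost = rng.uniform(5, 120, n)   # round-trip travel cost, dollars
income = rng.uniform(30, 150, n)       # household income, $1,000s
# Simulated trip counts from an assumed Poisson demand function
trips = rng.poisson(np.exp(1.8 - 0.02 * travel_cost + 0.004 * income))

X = sm.add_constant(np.column_stack([travel_cost, income]))
fit = sm.Poisson(trips, X).fit(disp=False)

beta_tc = fit.params[1]                # fitted coefficient on travel cost
cs_per_trip = -1.0 / beta_tc           # consumer surplus per trip
print(f"Estimated consumer surplus per trip: ${cs_per_trip:,.2f}")
```

In this class of model, per-trip consumer surplus is estimated as -1/b, where b is the fitted travel cost coefficient; the trip expenditures collected in Q19 would inform the travel cost variable.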
“Public Perceptions of Prescribed Fires” Question Justifications
Section/Question | Justification/Data use | Literature (if relevant)
Section 1. The purpose of this section is to understand baseline awareness, direct experiences, and self-perceived knowledge of prescribed fires. This information will be used to understand how these factors influence public perceptions and support levels for prescribed fire practices. | |
Q1. Before today, had you heard about prescribed fires within the Gulf Coast region of Alabama and Mississippi? | This information will be used to assess baseline awareness of prescribed fires, which may influence perceptions and support levels. Managers can use this information to tailor outreach and educational efforts to enhance public understanding. | Research suggests that awareness of and experience with prescribed fires are associated with positive attitudes toward and support of prescribed fires (Wu et al., 2022; Jacobson et al., 2001).
Q2. Have you had any of the following experiences with prescribed fires within the Gulf Coast region of Alabama and Mississippi within the last five years? | This information will be used to assess how direct experiences with prescribed fires shape respondents’ perceptions and support levels. Managers can use this information to gauge public perception and refine prescribed fire management strategies accordingly. |
Q5. In general, how knowledgeable do you feel about prescribed fires? | This information will be used to assess how self-perceived knowledge of prescribed fires shapes respondents’ perceptions and support levels. Managers can use this information to identify gaps in understanding and inform strategies for public education and outreach. | Research indicates that knowledge plays a crucial role in shaping public support for prescribed burning (Blanchard and Ryan, 2003; McCaffrey, 2009). Individuals with a greater understanding of prescribed burning are less likely to express concerns about its impact on aesthetics, air quality, and wildlife (McCaffrey, 2009). Public perceptions and acceptance of prescribed fires vary widely across regions (Fried et al., 2006), between urban and rural residents (Rosen et al., 2022), and among local residents versus visitors (Jacobson et al., 2001; Thapa et al., 2008).
Section 2. The purpose of this section is to understand public opinions and perceptions regarding prescribed fires. This information, analyzed alongside variables like awareness, beliefs, and concerns, will help researchers understand the complex relationships influencing community perceptions of prescribed fire practices. Managers can use this information to refine communication strategies and educational outreach to enhance community awareness and acceptance of prescribed fires in the region, informing site management decisions. | |
Q6. Given what you currently know about prescribed fires, what best describes your opinion about prescribed fires within the Gulf Coast region of Alabama and Mississippi? | This information will be used to understand current opinions toward prescribed fires and to examine the relationships between other variables and support level. Managers can use this information to tailor communication strategies and educational outreach efforts aimed at improving community awareness and acceptance of prescribed fires. |
Q7. Even if you have not made up your mind, which way are you leaning? | |
Q8. Prescribed fires are one strategy to achieve management outcomes on public lands. How important are the following management outcomes to you? | This information will be used to assess the importance respondents place on various management outcomes achievable through prescribed fires. Managers can use this information to prioritize management strategies that align with public values and priorities, thereby enhancing community support. | Prescribed burning is a well-established forest management tool used to achieve various goals such as fuel reduction, wildlife habitat improvement, and enhancing forest aesthetics and accessibility (USDA Forest Service, 2018). However, it carries risks such as fire escape and potential impacts on air quality (Rosen et al., 2022). Public support for prescribed burning largely hinges on perceptions of its benefits and risks (Ascher et al., 2012). For example, a study in Texas highlighted that those who perceived prescribed fire as beneficial for wildlife habitat were more supportive of its use (Rideout et al., 2003). Additionally, rural residents often express concerns about fire control, whereas urban residents are more sensitive to smoke exposure (Rosen et al., 2022).
Q9. Thinking about the same list and given what you currently know about prescribed fires, how effective do you think prescribed fires are at achieving these management outcomes? | This information will be used to understand respondents’ perceptions of the effectiveness of prescribed fires in achieving specific management outcomes. Managers can use this information to tailor communication strategies and demonstrate the efficacy of prescribed fire as a management tool, addressing public perceptions and enhancing support for its implementation. |
Q11. Given what you currently know about prescribed fires, how would you rate your concern about the following potential effects? | This information will be used to assess respondents’ concerns about various potential effects of prescribed fires. Managers can use this information to understand public apprehensions and develop strategies to mitigate perceived risks, thereby fostering greater community acceptance and support for prescribed fire programs. |
Section 3. The purpose of this section is to understand public perceptions of information provision and trust in management efforts related to prescribed fires. This information will inform strategies to enhance communication effectiveness, improve transparency, and build community trust, ultimately fostering greater understanding, acceptance, and support for prescribed fire practices in the region. | |
Q3. Do you believe enough information is provided to residents about prescribed fires within the Gulf Coast region of Alabama and Mississippi? | This information will be used to assess public perceptions regarding the adequacy of information dissemination about prescribed fires. Managers can use this information to improve communication efforts and ensure residents are well informed, thereby fostering greater community understanding and acceptance of prescribed fires. | Trust in the land management agency, including trust in information timeliness and accuracy, is often as important as knowledge, beliefs, and concerns in shaping public acceptance and support for prescribed fires (McCaffrey, 2009; Nelson et al., 2004; Dupéy and Smith, 2018; Vaske et al., 2007; Wu et al., 2022). For example, homeowners in Minnesota and Florida viewed prescribed fires favorably when conducted by knowledgeable professionals familiar with local ecology and fire behavior (Nelson et al., 2004). Similarly, Colorado residents in the wildland-urban interface showed more positive attitudes towards prescribed fires when they trusted the public land agency to provide accurate information (Vaske et al., 2007).
Q4. How timely do you believe information is provided to residents about prescribed fires within the Gulf Coast region of Alabama and Mississippi? | This information will be used to assess public perceptions regarding the timeliness of information dissemination about prescribed fires. Managers can use this feedback to ensure residents receive timely updates about prescribed fires, which can build trust and support among the community. |
Q10. Thinking about the same list again, how much do you trust that managers within the Gulf Coast region of Alabama and Mississippi are doing their best to achieve these management outcomes? | This information will be used to assess public trust in managers’ efforts to achieve prescribed fire management outcomes. Managers can use this insight to strengthen transparency, accountability, and engagement with the community, thereby building trust and enhancing support for prescribed fire initiatives. |
Q12. Thinking about the same list, how much do you trust that managers within the Gulf Coast region of Alabama and Mississippi are doing their best to minimize the potential effects of prescribed fires? | This information will be used to assess public trust in managers’ efforts to mitigate the potential effects of prescribed fires. Managers can use this feedback to improve communication on mitigation strategies, demonstrate proactive efforts, and build confidence in their ability to minimize adverse impacts, thereby fostering greater public acceptance and support for prescribed fire practices. |
Section 4. The purpose of this section is to characterize the demographic profile of respondents. These data will be used for weighting to ensure representativeness within the target population (see the weighting sketch following this table) and for subgroup analyses. Understanding how different populations perceive prescribed fires is crucial for effective planning and decision-making. | |
Q13-Q25 | These are demographic questions. See the Section 4 justification above. |
Q16. When visiting an area for outdoor recreation, what language do you prefer information to be in? | This information can be used to assess whether current signage and materials meet respondents’ language needs. Managers can use it to make decisions about translating signage and other materials, and about training or hiring staff proficient in specific languages. |
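Both demographic sections (Section 5 of the recreation survey and Section 4 of this survey) state that responses will be weighted for representativeness. The sketch below is a minimal post-stratification example of that idea; the age brackets, sample counts, and population shares are hypothetical placeholders, since the actual weighting targets would come from the Census/ACS benchmarks described in Part B.

```python
# Minimal post-stratification sketch. All figures below are hypothetical;
# real weighting targets would come from Census/ACS benchmarks.
import pandas as pd

sample = pd.DataFrame({"age_group": ["18-34", "35-64", "65+"],
                       "n_respondents": [40, 120, 40]})
population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}  # e.g., ACS

sample["sample_share"] = sample["n_respondents"] / sample["n_respondents"].sum()
# Weight = population share / sample share, so each group contributes in
# proportion to its population share rather than its sample share.
sample["weight"] = sample["age_group"].map(population_share) / sample["sample_share"]
print(sample[["age_group", "weight"]])
```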
Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
Survey respondents will be recruited by mail and offered the option of completing the survey on paper (mail-back) or online via a URL or QR code. In the pretest, 18% of respondents mailed back a paper survey and 82% responded online. A similar response pattern is expected for the full survey implementation. The pretest report is submitted as a supplementary document.
Advantages of an online survey for the federal government include ease of data collection, lower costs, and automation of data input and handling. Disadvantages include the absence of an interviewer and the inability to reach select populations. According to the 2021 ACS [1], an estimated 91.4% (±1.5%) of households in York County, Virginia; 84.6% (±1.5%) of households in Mobile County, Alabama; 88.2% (±1.0%) of households in Baldwin County, Alabama; 78.0% (±0.4%) of households in Jackson County, Mississippi; and 86.3% (±0.9%) of households in Escambia County, Florida had a broadband internet subscription (US Census Bureau, 2021). Based on the pretest data, around 80% of responses are expected to come from online survey administration (see Part B, Section 3 for more information on maximizing response rates and dealing with nonresponse).
The survey administration tool was developed to minimize burden for respondents and to minimize response bias, while maximizing response rate and data quality, based on best practices for online survey research.
Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Question 2.
Researchers reviewed scholarly works and consulted with local partners to identify any duplication of effort. There have been very few recreation studies based in the Chesapeake Bay region within the last 20 years. For example, Cottrell (2002) and Lipton (2004) conducted surveys of boaters in the Chesapeake Bay to understand responsible environmental behaviors and to value water quality improvements, respectively. Recently, Kane et al. (2021) used webcam and unmanned aerial vehicle imagery to analyze spatial and temporal beachgoer behaviors in Virginia Beach, examining the impacts of COVID-19. In another study, Usher (2021) conducted an online survey of surfers in Virginia and North Carolina to investigate perceptions of beach nourishment. Likewise, there are few studies of public perceptions of fire around the Gulf Coast. For example, Jacobson et al. (2001) surveyed rural and suburban Florida residents; Rideout et al. (2003) surveyed recreational area visitors in eastern Texas; Fried et al. (2006) surveyed residents from California, Michigan, and Florida; Thapa et al. (2008) surveyed Florida tourists; and Jarrett et al. (2009) surveyed non-industrial private forest landowners from various states in the Southern United States.
Therefore, according to our literature review and discussions with local partners, our surveys do not duplicate existing efforts. However, each of the above studies has been used to inform the development of the proposed survey instrument, including ecosystem service selection and scenario development. We have also formed partnerships with ongoing and planned research efforts so that we can leverage resources and provide complementary information.
If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
This collection involves residents. It does not involve small businesses or other small entities.
Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
If this data collection is not conducted, NOAA, along with state and local government agencies, will not have the information required to fulfill evaluation requirements outlined in NEPA and the CZMA. Further, the absence of updated socio-economic information would limit the NERRs’ capacity to assess the social impacts of management proposals and evaluate the effectiveness of existing management actions.
Explain any special circumstances that would cause an information collection to be conducted in a manner inconsistent with OMB guidelines.
Data collection will be consistent with OMB guidelines.
This collection will not require respondents to report information to the agency more often than quarterly.
It will not require respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it.
It will not require respondents to submit more than an original and two copies of any document.
The collection will not require respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years.
The collection is designed to produce valid and reliable results that can be generalized to the universe of study.
It will not require the use of a statistical data classification that has not been reviewed and approved by OMB.
The collection does not include a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use.
The collection does not require respondents to submit proprietary trade secrets, or other confidential information.
If applicable, provide a copy and identify the date and page number of publications in the Federal Register of the agency's notice, required by 5 CFR 1320.8 (d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Two Federal Register Notices published on August 25, 2023 (87 FR 59781) and February 1, 2024 (89 FR 7377) solicited public comments. No comments were received.
Consultation
As a part of project scoping and development, NOAA consulted with 21 individuals from the following stakeholder groups to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported. Those stakeholder groups include:
Gulf study: WBNERR, GNDNERR, South Alabama Land Trust, Mississippi-Alabama Sea Grant, Alabama Cooperative Extension System, and the Mississippi State University Coastal Marine Extension Program.
Coastal Virginia study: CBNERR-VA, Virginia Coastal Zone Management Program, Virginia Institute of Marine Science, Virginia Department of Wildlife Resources, Middle Peninsula Planning Access Authority, PlanRVA, Crater Planning District Commission, Hampton Roads Planning District Commission, Gloucester County Parks and Recreation, York County Parks, Recreation, and Tourism, and several local, state, and national parks, including Machicomoco State Park, York River State Park, Colonial National Historical Park, Captain John Smith Chesapeake National Historic Trail, New Quarter Park, Back Creek Park, and Gloucester Point Beach Park.
From these individuals, we received feedback on survey length; the appropriate mode of survey administration (i.e., mail-back versus online administration); problematic survey items in terms of utility, clarity, etc.; item order on the survey instrument; item format and presentation; and opportunities to leverage this survey with previous or existing research efforts. Feedback from these consultations was used to better understand, anecdotally, public sentiment regarding the issues, the type of data already available on relevant topics, and data needs from the perspective of local and regional agencies. Information from these consultations was used during project scoping and development and to revise and improve the survey instrument.
Following publication of the 30-day Federal Register notice, one public comment was submitted. The comment largely raised issues outside the purview of this information collection; the pertinent portion, a concern about the cost to the federal budget, is addressed below.
Response: Thank you for your comment. While we understand concerns about the federal budget, this survey is a cost-effective way to support coastal economies by informing resource management, tourism, and public access strategies that sustain jobs and protect natural assets. Collecting these data now also helps fulfill federal mandates and ensures local decisions are based on accurate and timely public input, potentially avoiding higher costs in the future.
Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
We plan to offer a $2 non-contingent incentive to respondents along with the survey invitation. In the pretest, the incentive helped motivate respondents to answer the survey, and based on prior research it is expected to increase response rates by at least 10%.
Unconditional (prepaid/non-contingent/up-front/token) incentives are received with the survey invitation itself. Unconditional incentives are based on social exchange theory and altruism, and assume that because the recipient has received a gift, they will reciprocate and complete the survey (Dillman et al., 2014). The incentive creates a feeling of goodwill and encourages the respondent to read and consider the request, and ultimately complete the survey.
Multiple meta-analyses suggest that unconditional incentives are more likely to induce participation than are conditional incentives (Church, 1993; Edwards et al., 2009; Young et al., 2015; Robbins et al., 2018; Rao et al., 2020), although the effect may vary and may be confounded across survey modes (Cook et al., 2000; Mercer et al., 2015). For example, in surveys recruited only by web-based or electronic means, it is not possible to give cash up-front to a participant, although digital payment methods may help close this gap (Neman et al., 2022).
Unconditional incentives positively influence response rates, although the magnitude of this effect varies. Porter and Whitcomb (2011) suggest that up-front cash (from 50 cents to $5) increases response rates anywhere from 8.7 to 24 percentage points. Dillman et al. (2014) found that $2 incentives may boost response rates by 12 to 31 percentage points. Butler et al. (2017) found that a $2 incentive increased the response rate by 13 percentage points in a survey of family forest owners.
In mail surveys, unconditional incentives had the greatest per-dollar effect on response rate compared to other types of incentives, but the impact of each additional pre-paid dollar is non-linear and quickly asymptotic (the so-called “dose response”): beyond the first dollar, each additional dollar has a smaller positive effect on response rates (Mercer et al., 2015). Gneezy and Rey-Biel (2014) tested unconditional incentives from $1 to $30 and found that an unconditional incentive greater than $8 does not result in further gains in response rates. Other NOAA surveys (e.g., NOAA’s Fishing Effort Survey) typically include a $2 incentive (Andrews et al., 2014; Anderson et al., 2021; Carter et al., 2024). Additionally, unconditional incentives may reduce non-response bias by increasing the propensity of some respondents to respond, although empirical support for this theory is mixed (Groves et al., 2006; Oscarsson & Arkhede, 2019). Recent work has also explored making unconditional incentives more visible; this can improve response rates by 1 to 4 percentage points (DeBell et al., 2020; DeBell, 2023; Zhang et al., 2023). Increasing the visibility of the unconditional incentive is presumed to increase the likelihood that a respondent will open the initial invitation letter, and thus respond to the survey.
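For context only, the sketch below ties together two figures stated elsewhere in this statement: the $25,524 incentive line item in Question 14 and the 2,229 expected respondents in Question 12. The invitation count and implied response rate it prints are inferences under the assumption that the incentive budget equals $2 per mailed invitation; they are not figures stated in this document. See Part B for the actual sampling plan.

```python
# Back-of-envelope inference only; see Part B for the actual sampling plan.
incentive_budget = 25_524          # incentive line item (Question 14 table)
incentive_per_invite = 2           # $2 non-contingent incentive
invitations = incentive_budget // incentive_per_invite   # assumed mailings
expected_completes = 1_716 + 513   # expected respondents (Question 12 table)
implied_rate = expected_completes / invitations
print(f"{invitations:,} invitations; implied response rate {implied_rate:.1%}")
# Prints: 12,762 invitations; implied response rate 17.5%
```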
The last question in the pretest survey asked if respondents had any additional comment. Three respondents expressed their gratitude for the incentive. One respondent wrote:
“You might like to know that I completed this survey because of the cash incentive. I appreciate that you were willing to risk even a few dollars per response, and provided the incentive without requiring me to activate cards or submit forms.”
Although anecdotal, this comment illustrates how unconditional incentives function to elicit responses.
Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy. If the collection requires a systems of records notice (SORN) or privacy impact assessment (PIA), those should be cited and described here.
A contracted survey vendor will handle the procurement of an address frame and administration of the survey. NOAA will ensure that the vendor implements strict security and privacy measures, including storing Personally Identifiable Information (PII) on secure servers rather than portable devices and protecting local computers with passwords in a locked facility. Mailing addresses will be used exclusively by the vendor for distributing survey materials and will not be linked to individual responses. PII will not be stored with survey data; instead, each sample unit will be assigned a unique PIN. The Privacy Act does not apply since records will not be retrieved by PII. The public will be informed about how their information will be used on the survey instrument. Access to raw, de-identified data will be limited to project managers and lead analysts, and any released datasets will be aggregated to maintain confidentiality.
Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
No questions of a sensitive nature will be asked during this data collection.
Provide estimates of the hour burden of the collection of information.
The table below provides an estimate of burden hours by data collection phase (see Part B.1 for more details). For the full survey implementation, we estimate 2,229 respondents. Each survey is expected to take approximately 10 minutes, including time for reading instructions, reviewing questions, and completing the survey instrument. These estimates are based on the question types, survey length, and the researchers’ experience with similar surveys. A worked check of the table’s arithmetic follows the table and its footnote.
Information Collection | Type of Respondent | # of Respondents/year | Annual # of Responses/Respondent | Annual # of Responses | Burden Hrs/Response | Annual Burden Hrs | Hourly Wage Rate* | Total Annual Wage Burden Costs
Full Implementation | | | | | | | |
York River Outdoor Recreation Survey | Individuals | 1,716 | 1 | 1,716 | 10 min | 286 | $47.20 | $13,499.20
Public Perceptions of Prescribed Fires | Individuals | 513 | 1 | 513 | 10 min | 86 | $47.20 | $4,059.20
Total | | | | 2,229 | | 372 | | $17,558.40
*The mean hourly wage for civilian workers from the BLS 2024 National Occupational Employment and Wage Estimates was used to encompass the broad range of occupations in the respondent pool. https://www.bls.gov/news.release/pdf/ecec.pdf
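As a cross-check, the figures in the table can be re-derived from the stated inputs (10 minutes per response and the BLS mean wage). The sketch below simply reproduces the table’s arithmetic and is not part of the official burden worksheet.

```python
# Worked check of the burden table above, using only figures stated here.
WAGE = 47.20                               # BLS mean hourly wage (see footnote *)
surveys = {"York River Outdoor Recreation Survey": 1716,
           "Public Perceptions of Prescribed Fires": 513}

total_hours = 0
total_cost = 0.0
for name, respondents in surveys.items():
    hours = round(respondents * 10 / 60)   # 10 minutes per response, rounded
    cost = hours * WAGE
    total_hours += hours
    total_cost += cost
    print(f"{name}: {hours} hrs, ${cost:,.2f}")
print(f"Total: {total_hours} hrs, ${total_cost:,.2f}")  # 372 hrs, $17,558.40
```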
Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).
There are no capital costs or operating and maintenance costs associated with this information collection. No additional cost burden will be incurred by respondents beyond response time.
Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.
The Commerce Alternative Personnel System (CAPS) pay tables for the Rest of U.S. locality were used to determine the base salary (https://www.commerce.gov/sites/default/files/2024-01/CAPS_rpStandard_2024.pdf). The Rest of U.S. locality was used because NOAA employees are geographically dispersed. A multiplier of 1.5 was applied to calculate the loaded salary. The pay tables have not been updated since the last submission for this collection. A worked check of the annualized totals follows the table.
Cost Descriptions | Grade/Step | Loaded Salary/Cost | % of Effort | Fringe (if Applicable) | Total Cost to Government
Federal Positions | ZA-IV | $238,290 | 25% | | $59,572.50
Contractor Cost | | | | |
Survey Vendor Cost breakdown: | | | | |
Printing | | $38,000 | | |
Postage | | $15,000 | | |
Address Frame purchase | | $10,000 | | |
Incentive | | $25,524 | | |
Labor and overheads | | $311,476 | | |
Total Survey Vendor Costs | | $400,000 | | | $400,000 ($133,333 annualized)
Contractor Positions | | $75,000 | | | $75,000
Travel | | | | |
Other Costs: | | | | |
TOTAL | | | | | $267,905.50
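The annualized total can be re-derived from the line items above. The sketch below assumes the three-year annualization implied by the table’s “$400,000 ($133,333 annualized)” entry, consistent with a standard three-year OMB approval period, and is illustrative only.

```python
# Worked check of the annualized cost to the Federal government, using the
# line items in the table above. The three-year divisor is inferred from the
# "$400,000 ($133,333 annualized)" entry.
federal = 238_290 * 0.25                        # ZA-IV loaded salary, 25% effort
vendor_items = [38_000, 15_000, 10_000, 25_524, 311_476]
vendor_total = sum(vendor_items)                # $400,000
vendor_annualized = vendor_total // 3           # $133,333 (truncated, per table)
contractor = 75_000                             # contractor positions (annual)
total = federal + vendor_annualized + contractor
print(f"Vendor total: ${vendor_total:,}")                 # $400,000
print(f"Annualized cost to government: ${total:,.2f}")    # $267,905.50
```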
Explain the reasons for any program changes or adjustments reported in ROCIS.
The York River pretest survey has been completed and is removed from this submission. We have added the full implementation of the York River Outdoor Recreation Survey and the Public Perceptions of Prescribed Fires survey.
Information Collection | Respondents (Current / Previous) | Responses (Current / Previous) | Burden Hours (Current / Previous) | Reason for change or adjustment
Pretests | 0 / 305 | 0 / 305 | 0 / 51 | Pretests have been concluded and are being removed from this control number.
York River Outdoor Recreation Survey - Full Implementation | 1,716 / 0 | 1,716 / 0 | 286 / 0 | Full implementation survey added.
Public Perceptions of Prescribed Fires | 513 / 0 | 513 / 0 | 86 / 0 | Full implementation survey added.
Total for Collection | 2,229 / 305 | 2,229 / 305 | 372 / 51 |
Difference | 1,924 | 1,924 | 321 |
(“Current” and “Previous” refer to the current and previous renewal/revision.)
For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
Data will be collected by a contract vendor and analyzed by the NOAA research team. Final products will be determined based on partner and stakeholder needs. Findings may be presented at professional conferences and published in peer reviewed journals.
If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
The agency plans to display the expiration date for OMB approval of the information collection on all instruments.
Explain each exception to the certification statement identified in “Certification for Paperwork Reduction Act Submissions."
The agency certifies compliance with 5 CFR 1320.9 and the related provisions of 5 CFR 1320.8(b)(3).
Anderson, L., Jans, M., Lee, A., Doyle, C., Driscoll, H., & Hilger, J. (2021). Effects of survey response mode, purported topic, and incentives on response rates in human dimensions of fisheries and wildlife research. Human Dimensions of Wildlife, 27(3), 201–219. https://doi.org/10.1080/10871209.2021.1907633.
Andrews, R., Brick, J. M., & Mathiowetz, N. A. (2014). Development and Testing of Recreational Fishing Effort Surveys. National Oceanic and Atmospheric Administration, National Marine Fisheries Service, Silver Spring.
Ascher, T. J., Wilson, R. S., & Toman, E. (2012). The importance of affect, perceived risk and perceived benefit in understanding support for fuels management among wildland–urban interface residents. International Journal of Wildland Fire, 22(3), 267-276.
Baran, P. K., Smith, W. R., Moore, R. C., Floyd, M. F., Bocarro, J. N., Cosco, N. G., & Danninger, T. M. (2014). Park use among youth and adults: examination of individual, social, and urban form factors. Environment and Behavior, 46(6), 768-800.
Blanchard, B., & Ryan, R. L. (2003). Community perceptions of wildland fire risk and fire hazard reduction strategies at the wildland-urban interface in the northeastern United States. In Proceedings of the 2003 Northeastern Recreation Research Symposium (Vol. 317, pp. 285-294).
Butler, B. J., Hewes, J. H., Tyrrell, M. L., & Butler, S. M. (2017). Methods for increasing cooperation rates for surveys of family forest owners. Small-Scale Forestry, 16(2), 169–177. https://doi.org/10.1007/s11842-016-9349-7.
Carter D. W. et al. (2024). A comparison of recreational fishing demand estimates from a mail-push versus email-only sampling strategy: Evidence from a Survey of Gulf of Mexico Anglers, North American Journal of Fisheries Management.
Church, A. H. (1993). Estimating the effect of incentives on mail survey response rates: A meta-analysis. Public Opinion Quarterly, 57(1), 62–79. https://doi.org/10.1086/269355.
Cook, C., Heath, F., & Thompson, R. L. (2000). A meta-analysis of response rates in Web- or internet-based surveys. Educational and Psychological Measurement, 60(6), 821–836. https://doi.org/10.1177/00131640021970934
Cottrell, S. (2002). Predictive model of responsible environmental behaviour: Application as a visitor monitoring tool. Monitoring and Management of Visitor Flows in Recreational and Protected Areas, 129-135.
DeBell M., Maisel N., Edwards B., Amsbary M., Meldener V. (2020). Improving survey response rates with visible money. Journal of Survey Statistics and Methodology, 8, 821–831.
DeBell, M. (2023). The visible cash effect with prepaid incentives: Evidence for data quality, response rates, generalizability, and cost, Journal of Survey Statistics and Methodology, 11(5), 991–1010. https://doi.org/10.1093/jssam/smac032.
Dillman, D. A., Smyth, J. D., Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method. John Wiley and Sons Inc.
Dupéy, L. N., & Smith, J. W. (2018). An integrative review of empirical research on perceptions and behaviors related to prescribed burning and wildfire in the United States. Environmental management, 61(6), 1002-1018.
Edwards, P. J., Roberts, I., Clarke, M. J., DiGuiseppi, C., Wentz, R., Kwan, I., Cooper, R., Felix, L. M., & Pratap, S. (2009). Methods to increase response to postal and electronic questionnaires. Cochrane Database of Systematic Reviews, 3(3). https://doi.org/10.1002/14651858.MR000008.pub4.
Ellis, J.M., Yuan, R., He, Y., and Zaslow, S. (2017). 2017 Virginia Outdoors Demand Survey.
Fried, J. S., Gatziolis, D., Gilless, J. K., Vogt, C. A., & Winter, G. (2006). Changing beliefs and building trust at the wildland urban interface. Fire Management Today, 66(3), 51-54.
Gneezy, U., & Rey-Biel, P. (2014). On the relative efficiency of performance pay and noncontingent incentives. Journal of the European Economic Association, 12(1), 62–72. https://doi.org/10.1111/jeea.12062.
Grooms, B., Rutter, J. D., Barnes, J. C., Peele, A., & Dayer, A. A. (2020). Supporting wildlife recreationists in Virginia: Survey report to inform the Virginia Department of Wildlife Resources' wildlife viewing plan. Virginia Tech.
Groves, R., Couper, M., Presser, S., Singer, E., Tourangeau, R., Acosta, G., and Nelson, L. (2006). Experiments in producing nonresponse bias, Public Opinion Quarterly, 70, 720–736.
Jacobson, S. K., Monroe, M. C., & Marynowski, S. (2001). Fire at the wildland interface: the influence of experience and mass media on public knowledge, attitudes, and behavioral intentions. Wildlife Society Bulletin, 929-937. https://www.jstor.org/stable/3784420
Jarrett, A., Gan, J., Johnson, C., & Munn, I. A. (2009). Landowner awareness and adoption of wildfire programs in the southern United States. Journal of Forestry, 107(3), 113-118.
Johnston, R. J., Grigalunas, T. A., Opaluch, J. J., Mazzotta, M., & Diamantedes, J. (2002). Valuing estuarine resource services using economic and ecological models: the Peconic Estuary System study. Coastal Management, 30(1), 47-65.
Kane, B., Zajchowski, C. A., Allen, T. R., McLeod, G., & Allen, N. H. (2021). Is it safer at the beach? Spatial and temporal analyses of beachgoer behaviors during the COVID-19 pandemic. Ocean & coastal management, 205, 105533. https://doi.org/10.1016/j.ocecoaman.2021.105533
Lipton, D. (2004). The value of improved water quality to Chesapeake Bay boaters. Marine Resource Economics, 19(2), 265-270. https://doi.org/10.1086/mre.19.2.42629432
McCaffrey, S. (2009). Crucial factors influencing public acceptance of fuels treatments. Fire Management Today, 69(1), 9.
Mercer, A., Caporaso, A., Cantor, D., & Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105–129. https://doi.org/10.1093/poq/nfu059.
Nelson, K. C., Monroe, M. C., Johnson, J. F., & Bowers, A. (2004). Living with fire: homeowner assessment of landscape values and defensible space in Minnesota and Florida, USA. International Journal of Wildland Fire, 13(4), 413-425.
Neman, T. S., Dykema, J., Garbarski, D., Jones, C., Schaeffer, N. C., & Farrar-Edwards, D. (2022). Survey monetary incentives: Digital payments as an alternative to direct mail. Survey Practice, 15(1), 1–7. https://doi.org/10.29115/sp-2021-0012.
Oscarsson, H., & Arkhede, S. (2019). Effects of conditional incentives on response rate, non-response bias and measurement error in a high response-rate context. International Journal of Public Opinion Research. doi:10.1093/ijpor/edz015.
Porter, S. R., & Whitcomb, M. E. (2011). The impact of lottery incentives on student survey response rates. Research in Higher Education, 44(4), 389–407.
Rao, N. (2020). Cost effectiveness of pre- and post-paid incentives for mail survey response. Survey Practice, 13(1), 1–7. https://doi.org/10.29115/sp-2020-0004.
Rideout, S., Oswald, B. P., & Legg, M. H. (2003). Ecological, political and social challenges of prescribed fire restoration in east Texas pineywoods ecosystems: a case study. Forestry, 76(2), 261-269. https://doi.org/10.1093/forestry/76.2.261
Robbins, M. W., Grimm, G., Stecher, B., & Opfer, V. D. (2018). A comparison of strategies for recruiting teachers into survey panels. SAGE Open, 8(3). https://doi.org/10.1177/2158244018796412.
Rosen, Z., Henery, G., Slater, K. D., Sablan, O., Ford, B., Pierce, J. R., Fischer, E. V., & Magzamen, S. (2022). A Culture of Fire: Identifying Community Risk Perceptions Surrounding Prescribed Burning in the Flint Hills, Kansas. Journal of Applied Communications, 106(4), 6.
Rushing, B.R., Leavell, M., Nzaku, K., and Black, N. (2021). Outdoor Recreation in Alabama.
Strickler, M.J., Cristman, C.E., and Poole, D. (2018). Virginia Outdoors Plan 2018. Virginia Department of Conservation and Recreation.
Thapa, B., Holland, S. M., & Absher, J. D. (2008). Perceived risk, attitude, knowledge, and reactionary behaviors toward wildfires among Florida tourists. In D. J. Chavez, J. D. Absher, & P. L. Winter (Eds.), Fire social science research from the Pacific Southwest research station: Studies supported by national fire plan funds. (Vol. General Technical Report PSW-GTR-209, pp. 87). USDA Forest Service: Pacific Southwest Research Station.
US Census Bureau. (2021). Computer and internet use in the United States: 2018. Available online: https://www.census.gov/content/dam/Census/library/publications/2021/acs/acs-49.pdf
USDA Forest Service. (2018). Introduction to prescribed fire in southern ecosystems. Research & Development, Southern Research Station. Science Update SRS-054.
Usher, L. E. (2021). Virginia and North Carolina surfers’ perceptions of beach nourishment. Ocean & Coastal Management, 203, 105471. https://doi.org/10.1016/j.ocecoaman.2020.105471
Vaske, J. J., Absher, J. D., & Bright, A. D. (2007). Salient value similarity, social trust and attitudes toward wildland fire management strategies. Human Ecology Review, 223-232.
Wu, H., Miller, Z. D., Wang, R., Zipp, K. Y., Newman, P., Shr, Y.-H., Dems, C. L., Taylor, A., Kaye, M. W., & Smithwick, E. A. (2022). Public and manager perceptions about prescribed fire in the Mid-Atlantic, United States. Journal of Environmental Management, 322, 116100.
Young, J. M., O’Halloran, A., McAulay, C., Pirotta, M., Forsdike, K., Stacey, I., & Currow, D. (2015). Unconditional and conditional incentives differentially improved general practitioners’ participation in an online survey: Randomized controlled trial. Journal of Clinical Epidemiology, 68(6), 693–697. https://doi.org/10.1016/j.jclinepi.2014.09.013
Zhang, S., West, B. T., Wagner, J., Couper, M. P., Gatward, R., & Axinn, W. G. (2023). Visible cash, a second incentive, and priority mail? An experimental evaluation of mailing strategies for a screening questionnaire in a national push-to-web/mail survey. Journal of Survey Statistics and Methodology, 11(5), 1011–1031.
[1] U.S. Census Bureau, 2017-2021 American Community Survey 5-Year Estimates.