

SUPPORTING STATEMENT

U.S. Department of Commerce

National Oceanic & Atmospheric Administration

Assessing Public Preferences and Values to Support Coastal and Marine Management

OMB Control No. 0648-0829


SUPPORTING STATEMENT PART B


B. Collections of Information Employing Statistical Methods

  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.



  i. Potential Respondent Universe and Response Rate

Coastal VA - Outdoor Recreation:

The potential respondent universe for this study includes residents aged 18 and over living within a one-hour drive of the York River. The one-hour driving radius for the Coastal Virginia study is based on human mobility data from 2022, which found that roughly two-thirds of visitors to the York River live up to an hour away. The estimated total number of occupied households in this study area is 1,347,170 (US Census Bureau, 2020) and the estimated total population 18 years and over is 2,739,072 (US Census Bureau/ACS, 2020).

Gulf - Prescribed Fires:

The potential respondent universe for this study includes residents aged 18 and over living within Mobile and Baldwin Counties in Alabama and Jackson County in Mississippi. The estimated total number of occupied households in this study area is 299,418 (US Census Bureau, 2020) and the estimated total population 18 years and over is 604,394 (US Census Bureau/ACS, 2020).

All Surveys Combined:

Mail-back surveys typically achieve a response rate of 20-30%.¹ Recent studies on similar topics have yielded similar response rates (Gómez and Hill, 2016; Schuett et al., 2016; Ellis et al., 2017; Guo et al., 2017; Murray et al., 2020; Knoche and Ritchie, 2022). However, response rates tend to be lower for some populations. Based on these estimates, researchers conservatively anticipate a response rate of 20% to 25% for each survey, depending on the socio-demographics of the individual strata.

  ii. Sampling and Respondent Selection Method

Data will be collected using a two-stage stratified random sampling design. The study region will be stratified geographically by county and Census tract. Details of the strata are explained below. Within each stratum, households will be selected at random, and within each selected household, the individual with the next upcoming birthday who is aged 18 or older will be requested to complete the survey to approximate random selection. Therefore, the primary sampling unit (PSU) is the household, and the secondary sampling unit (SSU) consists of individuals selected within each household.
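The first stage of the design above can be sketched in a few lines; this is a hedged illustration only (the tract names, address frame, and per-stratum counts are hypothetical placeholders, not the vendor's actual procedure, and the second stage happens in the field via the next-birthday rule):

```python
import random

def stratified_household_sample(frame: dict[str, list[str]],
                                n_per_stratum: dict[str, int],
                                seed: int = 42) -> dict[str, list[str]]:
    """Stage 1: within each stratum (Census tract), draw a simple random
    sample of household addresses without replacement. Stage 2 (selecting
    the adult with the next upcoming birthday) occurs in the field."""
    rng = random.Random(seed)
    return {tract: rng.sample(addresses, n_per_stratum[tract])
            for tract, addresses in frame.items()}

# Hypothetical address frame keyed by tract ID
frame = {"tract_A": [f"A-{i}" for i in range(100)],
         "tract_B": [f"B-{i}" for i in range(80)]}
sample = stratified_household_sample(frame, {"tract_A": 10, "tract_B": 8})
print({t: len(s) for t, s in sample.items()})  # {'tract_A': 10, 'tract_B': 8}
```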

The goals of the proposed strata are to ensure spatial representation and allow researchers to examine the influence of geographic proximity on responses. Census tracts with high proportions of hard-to-count communities will be oversampled (see question 2.iii). The maps below show the final sample size for each Census tract for the Coastal Virginia study (Figure 1) and for the Gulf study (Figure 2).

Figure 1: Coastal VA - Outdoor Recreation proposed sample size per Census tract.





Figure 2: Gulf - Prescribed Fires proposed sample size per Census tract.

Table 1 provides a breakdown of the tentative estimated number of completed surveys desired for each survey area, along with the corresponding sample size. To obtain the estimated minimum number of respondents (d), the sample size must be increased to account for both non-response (e) and mail non-delivery (f). Expected response rates vary by Census tract, ranging between 20 and 25 percent, so direct stratum-level calculations cannot be shown in Table 1; however, the estimated response rate for each stratum averages around 20%, including the expected increase gained by including a $2 non-contingent incentive. For example, at least 5% of the population in every Census tract in the Coastal VA study area belongs to at least one hard-to-count community; therefore, the expected response rate for each of these Census tracts is 20%. Because partial respondents are rounded up within each tract (e.g., 0.6 respondents would be rounded up to 1 respondent), the sample size is slightly larger than the estimated number of respondents divided by the response rate (i.e., 1,716 ÷ 0.2 = 8,580 < 8,786). The exact discrepancy will vary.

Note that these response rates assume a $2 incentive (see Question 3). Therefore, based on the statistical sampling methodology discussed in detail in Question 2 below, the estimated response rate, and the 10% non-deliverable rate, the sample size for the final collections will be 12,684. The sample size for the pretests is 305. See the degree-of-accuracy discussion under Question 2 for more details on determining the minimum sample size.
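The inflation from the minimum number of respondents (d) to the mailed sample (f) follows directly from the assumed response and delivery rates. A minimal sketch, using the Coastal VA target as an illustration; the figures in Table 1 are slightly larger because, as noted above, rounding is applied within each Census tract rather than once at the study level:

```python
import math

def mailed_sample_size(min_respondents: int,
                       response_rate: float = 0.20,
                       non_deliverable_rate: float = 0.10) -> int:
    """Inflate the target number of completes for expected non-response
    (e = d / RR) and mail non-delivery (f = e / (1 - NDR)),
    rounding up at each step."""
    e = math.ceil(min_respondents / response_rate)
    f = math.ceil(e / (1 - non_deliverable_rate))
    return f

# Coastal VA: d = 1,716 → e = 8,580 → f = 9,534 (Table 1 shows 9,762
# because rounding happens per tract in the actual design)
print(mailed_sample_size(1716))  # 9534
```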

Table 1: Estimates of sample size by survey

| Survey | Strata | Population 18+ (a) | Occupied Households (b) | Pretest Sample (c) | Est. Min Pretest Respondents | Est. Min Respondents (d) | Sample Adjusted for ~20% RR (e) = (d) ÷ ~20% | Sample Adjusted for 10% Non-Deliverable Rate (f) = (e) ÷ (1 − 10%) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Coastal VA | York River Census tracts | 2,739,072 | 1,347,170 | 1,182 | 213 | 1,716 | 8,786 | 9,762 |
| Gulf - Prescribed Fire | Baldwin, AL Census tracts | 227,131 | 87,190 | 157 | 28 | 157 | 794 | 882 |
| Gulf - Prescribed Fire | Mobile, AL Census tracts | 414,620 | 158,045 | 242 | 43 | 242 | 1,256 | 1,395 |
| Gulf - Prescribed Fire | Jackson, MS Census tracts | 142,993 | 54,183 | 114 | 21 | 114 | 581 | 645 |
| Gulf - Prescribed Fire | SUB-TOTAL | 784,744 | 299,418 | 513 | 92 | 513 | 2,630 | 2,922 |
| TOTAL | | 3,523,816 | 1,646,588 | 1,695 | 305 | 2,229 | 11,416 | 12,684 |

2. Describe the procedures for the collection of information including: statistical methodology for stratification and sample selection; estimation procedure; degree of accuracy needed for the purpose described in the justification; unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.

  i. Stratification and Sample Selection

A two-stage stratified random sampling design will be used for data collection. First, the study region will be stratified geographically by county and Census tract. Then, residential households will be randomly selected from each stratum using an address-based frame procured from the U.S. Postal Service.

  ii. Estimation Procedures

To obtain population-based estimates of various parameters, each responding household will be assigned a sampling weight. The weights will be used to produce estimates that:

  • are generalizable to the population from which the sample was selected;

  • account for differential probabilities of selection across the sampling strata;

  • match the population distributions of selected demographic variables within strata; and

  • allow for adjustments to reduce potential non-response bias.

These weights combine:

  • a base sampling weight which is the inverse of the probability of selection of the household;

  • a within-stratum adjustment for differential non-response across strata; and

  • a non-response weight.

Post-stratification adjustments will be made to match the sample to known population values (e.g., from Census data).

There are various models that can be used for non-response weighting. For example, non-response weights can be constructed based on estimated response propensities or on weighting class adjustments. Response propensities are designed to treat non-response as a stochastic process in which there are shared causes of the likelihood of non-response and the value of the survey variable. The weighting class approach assumes that within a weighting class (typically demographically defined), non-respondents and respondents have the same or very similar distributions on the survey variables. If this model assumption holds, then applying weights to the respondents reduces the bias in the estimator that is due to non-response. Several factors, including the difference between the sample and population distributions of demographic characteristics and the plan for how weights will be used in the regression models, will determine which approach is most efficient for estimating population parameters.
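The weight construction described above can be sketched as follows. This is a simplified illustration with hypothetical stratum counts (and it omits the post-stratification raking step, which requires external Census control totals); it is not the vendor's actual procedure:

```python
def survey_weights(pop_size: int, sampled: int, responded: int) -> float:
    """Combine the base weight (inverse probability of selection) with a
    within-stratum non-response adjustment, so that the respondents in a
    stratum collectively represent its full household population."""
    base_weight = pop_size / sampled     # inverse selection probability
    nr_adjustment = sampled / responded  # within-stratum non-response adjustment
    return base_weight * nr_adjustment   # simplifies to pop_size / responded

# Hypothetical strata: (population households, mailed sample, respondents)
strata = {"stratum_1": (50_000, 1_000, 220),
          "stratum_2": (30_000, 900, 180)}
weights = {s: survey_weights(*v) for s, v in strata.items()}
print(weights)
```

A useful check on weights built this way is that, within each stratum, the weights of the respondents sum back to the stratum's population size.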

  iii. Degree of Accuracy Needed for the Purpose Described in the Justification

The following formula can be used to determine the minimum required sample size, n, for analysis:

n = z² × p × (1 − p) / e²

where z is the z-value required for a specified confidence level (here, z = 1.96 for 95%), p is the proportion of the population with a characteristic of interest (here, p = 0.5, conservatively), and e is the margin of error (here, e = 0.05). Therefore,

n = (1.96² × 0.5 × 0.5) / 0.05² ≈ 384

This means a minimum sample size of 384 is required to produce estimates at the 95% confidence level with a 5% margin of error. This threshold is met by our sampling plan for the study population, at the state level, and for some counties and socio-demographic subgroups.
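The minimum sample size calculation above can be checked directly (a minimal sketch; z = 1.96 is the standard two-sided critical value for 95% confidence):

```python
def min_sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> float:
    """Cochran's formula for the minimum sample size needed to estimate a
    proportion p within margin of error e at the confidence level implied
    by z; p = 0.5 maximizes p(1 - p) and is therefore conservative."""
    return (z ** 2) * p * (1 - p) / (e ** 2)

print(round(min_sample_size()))  # 384
```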

  iv. Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems requiring specialized sampling procedures.

  v. Use of Periodic (Less Frequent than Annual) Data Collection Cycles to Reduce Burden

Data will not be collected annually from each individual site as we do not anticipate substantive changes in public preferences and values from year-to-year. Secondary data sources, such as human mobility data, may be used to track changes in visitation over time to reduce burden.

3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

Implementation Techniques

We will use mail contact and recruitment methods following the Dillman Tailored Design Method (Dillman et al., 2014). Potential respondents will receive multiple mailings, which may include a pre-survey notification postcard, an invitation letter, and follow-up reminders (see Appendix B for postcard and letter text). Respondents will have the option to complete the survey either by paper or online via a URL or QR code.

The final procedures for administering the survey and the design of the survey tool will be determined by the vendor we select. This decision will be based on the vendor’s expertise in maximizing response rates and their experience with similar surveys in the region. For instance, a recent survey conducted by the team in Oregon included an initial invitation letter with a URL, a reminder postcard with a URL, a reminder letter with the questionnaire, and a final letter with the questionnaire.

A key criterion for selecting this vendor will be their established trust within the community, such as partnerships with local university survey centers, which is expected to enhance the response rate (Ladik et al., 2007).

Additionally, the survey will be translated into additional languages to encourage participation by limited-English-speaking households and reduce the potential for non-response bias (Moradi et al., 2010; Smith, 2007). For Coastal VA - Outdoor Recreation, these languages may include Spanish, Arabic, Korean, and Chinese. For the Gulf surveys, these languages may include Spanish and Vietnamese.

Incentives

Incentives are consistent with numerous theories about survey participation (Singer and Ye, 2013), such as the theory of reasoned action (Ajzen and Fishbein, 1980), social exchange theory (Dillman et al., 2014), and leverage-salience theory (Groves et al., 2000). Inclusion of an incentive acts as a sign of good will on the part of the study sponsors and encourages reciprocity of that goodwill by the respondent.

Dillman et al. (2014) recommend including incentives not only to increase response rates but also to decrease non-response bias. Specifically, an incentive amount between $1 and $5 is recommended for surveys of most populations.

Church (1993) conducted a meta-analysis of 38 studies that implemented some form of mail survey incentive to increase response rates and found that providing a prepaid monetary incentive with the initial survey mailing increases response rates by 19.1% on average. Lesser et al. (2001) analyzed the impact of financial incentives in mail surveys and found that including a $2 bill increased response rates by 11% to 31%. Gajic et al. (2012) administered a stated-preference survey of a general community population using a mixed-mode approach where community members were invited to participate in a web-based survey using a traditional mailed letter. A prepaid cash incentive of $2 was found to increase response rates by 11.6%.

Given these findings, we believe a small, prepaid incentive will boost response rates by at least 10% and is the most cost-effective means of increasing response rates. This increase is reflected in the overall 20% expected response rate in Table 1. A $2 incentive was chosen based on the population being targeted and the funding available for the project.

Based on findings from the pretest of the Outdoor Recreation survey (see Question 4 below), the full implementation will use the $2 non-contingent incentive. In the pretest, response rates were as expected, and participants reported that the incentive positively affected their decision to respond.

Non-response bias analysis

Declining survey response rates are a growing concern because they increase the likelihood of non-response bias, which can limit the ability to develop population estimates from survey data. Non-response bias may exist even with high response rates if non-respondents differ greatly from respondents; however, information on non-respondents is often unavailable. One approach to estimating non-response bias in the absence of this information is the "continuum of resistance" model (Lin and Schaeffer, 1995), which assumes that those who respond only after repeated contact attempts (delayed respondents) would have been non-respondents had data collection stopped earlier; non-respondents are therefore more similar to delayed respondents than to early respondents. Researchers will assess the potential for non-response bias by comparing responses across contact waves. If bias is found, a weighting procedure, as discussed in the Estimation Procedures section above, can be applied, and the implications for policy outcome preferences will be examined and discussed.
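Under the continuum-of-resistance model, the wave comparison amounts to testing whether early and delayed respondents differ on key survey variables. A minimal sketch using a simple two-sample statistic on hypothetical attitude scores (the data and the choice of test are illustrative assumptions, not the study's specified analysis):

```python
from math import sqrt
from statistics import mean, stdev

def wave_comparison(early: list[float], late: list[float]) -> float:
    """Two-sample z-like statistic comparing mean responses of early
    respondents (first mailing) with delayed respondents (after
    reminders), who proxy for non-respondents under the
    continuum-of-resistance model. Large |z| suggests possible
    non-response bias."""
    se = sqrt(stdev(early) ** 2 / len(early) + stdev(late) ** 2 / len(late))
    return (mean(early) - mean(late)) / se

# Hypothetical 5-point attitude scores by contact wave
early = [4, 5, 3, 4, 4, 5, 4, 3, 5, 4]
late = [3, 3, 4, 2, 3, 4, 3, 3, 2, 4]
print(round(wave_comparison(early, late), 2))
```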

4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.



A small pretest of operations and methodology was conducted for the Outdoor Recreation Survey in Virginia. The pretest sample included 1,187 households. Households received an initial invitation with a $2 non-contingent incentive, a reminder postcard, a letter with a paper survey, and a final reminder postcard. Data from the pretest support the methodology for the full implementation. Further information on the pretest can be found in the report attached here as Enclosure 3.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Consultation on the statistical aspects of the study design was provided by Trent Buskirk ([email protected]).

This project will be implemented by researchers with NOAA’s National Centers for Coastal Ocean Science. The project Principal Investigator is:


Sarah Gonyo, PhD (Lead)

Economist

NOAA National Ocean Service

National Centers for Coastal Ocean Science

Email: [email protected]

Data collection will be contracted out to an external vendor. Data analysis will be conducted by the project principal investigators along with the following research team members, all members of the NOAA National Ocean Service National Centers for Coastal Ocean Science Social Science Team:



REFERENCES

  1. Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Englewood Cliffs, NJ: Prentice-Hall.

  2. Church, A.H. (1993). Estimating the effect of incentives on mail survey response rates: A meta-analysis. Public Opinion Quarterly, 57(1): 62-80.

  3. Dillman, D. A., Smyth, J. D., and Christian, L. M. (2014). Internet, mail, and mixed-mode surveys: The tailored design method. Hoboken, NJ: Wiley.

  4. Ellis, J.M., Yuan, R., He, Y., and Zaslow, S. (2017). 2017 Virginia Outdoors Demand Survey.

  5. Gajic, A., Cameron, D., & Hurley, J. (2012). The cost-effectiveness of cash versus lottery incentives for a web-based, stated-preference community survey. The European Journal of Health Economics, 13, 789-799. DOI: 10.1007/s10198-011-0332-0

  6. Gómez, E. and Hill, E. (2016). First Landing State Park: Participation Patterns and Perceived Health Outcomes of Recreation at an Urban-Proximate Park. Journal of Park and Recreation Administration, 34(1), 68-83. DOI: 10.18666/JPRA-2016-V34-I1-7034

  7. Groves, R.M., Singer, E., and Corning, A. (2000). Leverage-Saliency Theory of Survey Participation: Description and an Illustration. Public Opinion Quarterly, 64(3): 299-308.

  8. Guo, Z., Robinson, D., & Hite, D. (2017). Economic impact of Mississippi and Alabama Gulf tourism on the regional economy. Ocean & Coastal Management, 145, 52-61. https://doi.org/10.1016/j.ocecoaman.2017.05.006

  9. Knoche, S. and Ritchie, K. (2022). A travel cost recreation demand model examining the economic benefits of acid mine drainage remediation to trout anglers. Journal of Environmental Management, 319: 115485. https://doi.org/10.1016/j.jenvman.2022.115485

  10. Ladik, D. M., Carrillat, F. A., & Solomon, P. J. (2007). The effectiveness of university sponsorship in increasing survey response rate. Journal of Marketing Theory and Practice, 15(3), 263-271.

  11. Lesser, V.M., Dillman, D.A., Carlson, J., Lorenz, F., Mason, R., and Willits, F. (2001). Quantifying the influence of incentives on mail survey response rates and nonresponse bias. Presented at the Annual Meeting of the American Statistical Association, Atlanta, GA.

  12. Lin, I., & Schaeffer, N.C. (1995). Using survey participants to estimate the impact of nonparticipation. Public Opinion Quarterly, 59, 236-258. DOI: 10.1086/269471

  13. Moradi, T., Sidorchuk, A., & Hallqvist, J. (2010). Translation of questionnaire increases the response rate in immigrants: filling the language gap or feeling of inclusion? Scandinavian Journal of Public Health, 38(8), 889-892. DOI: 10.1177/1403494810374220

  14. Murray, R., Wilson, S., Dalemarre, L., Chanse, V., Phoenix, J., and Baranoff, L. (2020). Should We Put Our Feet in the Water? Use of a Survey to Assess Recreational Exposures to Contaminants in the Anacostia River. Environmental Health Insights, 9(s2). https://doi.org/10.1177/EHI.S19594

  15. Schuett, M., Ding, C., Kyle, G., & Shively, J.D. (2016). Examining the Behavior, Management Preferences, and Sociodemographics of Artificial Reef Users in the Gulf of Mexico Offshore from Texas. North American Journal of Fisheries Management, 36(2), 321-328. DOI: 10.1080/02755947.2015.1123204

  16. Singer, E. and Ye, C. (2013). The Use and Effects of Incentives in Surveys. The Annals of the American Academy of Political and Social Science, 645: 112-141.

  17. Smith, T.W. (2007). An evaluation of Spanish questions on the 2006 General Social Surveys. GSS Methodological Report No. 109.

  18. US Census Bureau. (2020). "H1: Occupancy Status." 2020 Census Redistricting Data (Public Law 94-171).

  19. US Census Bureau/American Community Survey. (2020). "B01001: Sex By Age." 2015-2020 American Community Survey.



¹ Dillman, Don A., Jolene D. Smyth, and Leah Melani Christian. (2009). Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method (3rd ed.). Hoboken, NJ: John Wiley & Sons, Inc.

